Library / Packages

The R packages used in the Time Series Analysis course up to the midterm exam are forecast, graphics, TTR, and TSA. If any of these packages are not yet installed, install them first.

#install.packages("forecast")
#install.packages("graphics")
#install.packages("TTR")
#install.packages("TSA")

If they are already installed, load each package's library.

library("forecast")
## Registered S3 method overwritten by 'quantmod':
##   method            from
##   as.zoo.data.frame zoo
library("graphics")
library("TTR")
library("TSA")
## Registered S3 methods overwritten by 'TSA':
##   method       from    
##   fitted.Arima forecast
##   plot.Arima   forecast
## 
## Attaching package: 'TSA'
## The following objects are masked from 'package:stats':
## 
##     acf, arima
## The following object is masked from 'package:utils':
## 
##     tar

Import Data

#install.packages("rio") # install if not yet available
library(rio)
data <- import("https://raw.githubusercontent.com/ignaciatarigan/mpdwt/main/data%20ignes.csv")
View(data)

Data Exploration

View the data with the View() function, inspect the structure of the data with str(), and the dimensions of the data with dim().

View(data)
str(data)
## 'data.frame':    365 obs. of  2 variables:
##  $ DT   : chr  "01/01/2021" "02/01/2021" "03/01/2021" "04/01/2021" ...
##  $ WS50M: num  5.58 4.59 3.05 2.95 3.56 3.94 9.41 5.88 6.23 3.7 ...
dim(data)
## [1] 365   2

Convert the data so that it is read as a time series object with the ts() function.

data1.ts <- ts(data$WS50M)
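Since the observations are daily, the ts() call could optionally declare a seasonal frequency so that seasonal methods become applicable later. A minimal sketch; the weekly (7-day) cycle is an assumption, not part of the original analysis, which uses the default frequency = 1:

```r
# Assumption: a weekly cycle; the original analysis keeps frequency = 1
x <- c(5.58, 4.59, 3.05, 2.95, 3.56, 3.94, 9.41, 5.88)  # first WS50M values
x.ts <- ts(x, frequency = 7)
frequency(x.ts)   # 7
cycle(x.ts)       # day-of-cycle index: 1 2 3 4 5 6 7 1
```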

Display a summary of the data.

summary(data1.ts)
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   2.070   3.830   4.920   5.338   6.400  14.210

Create a plot of the time series data.

ts.plot(data1.ts, xlab="Time Period ", ylab="Wind Speed", 
        main = "Time Series Plot")
points(data1.ts)

Single Moving Average & Double Moving Average

Data Splitting

The data are split into training and test sets in an 80:20 ratio, giving 292 training observations and 73 test observations.

# split into training and test data
training_ma <- data[1:292,]
testing_ma <- data[293:365,]
train_ma.ts <- ts(training_ma$WS50M)
test_ma.ts <- ts(testing_ma$WS50M)
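The hardcoded split indices above can also be derived from the 80% proportion. A small sketch (the variable names `n_train`, `train_idx`, and `test_idx` are illustrative):

```r
# Sketch: derive the split point from the 80% proportion instead of
# hardcoding 292 and 293 (n = 365 matches dim(data) above)
n <- 365
n_train <- floor(0.8 * n)                        # 292
train_idx <- 1:n_train
test_idx  <- (n_train + 1):n
c(n_train = n_train, n_test = length(test_idx))  # 292 and 73
```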

Data Exploration

Exploration is performed on the full dataset, the training data, and the test data using time series plots.

# explore the full dataset
plot(data1.ts, col="red",main="Plot semua data")
points(data1.ts)

# explore the training data
plot(train_ma.ts, col="blue",main="Plot data latih")
points(train_ma.ts)

# explore the test data
plot(test_ma.ts, col="blue",main="Plot data uji")
points(test_ma.ts)

Single Moving Average (SMA)

The basic idea of the Single Moving Average (SMA) is that the value in a given period is influenced by the values in preceding periods. This smoothing method is suitable for stationary (constant-pattern) data. Its principle is that the smoothed value at period t is the average of the m observations from period t back to period t-m+1. The smoothed value at period t is then used as the forecast for period t+1.
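The averaging rule can be checked by hand on the first few WS50M values. A sketch using base R's stats::filter as a self-contained alternative to the TTR::SMA call used below:

```r
# Sketch: the smoothed value at period t is the mean of the m most recent
# observations x_t, ..., x_(t-m+1); here m = 4 on the first WS50M values.
x <- c(5.58, 4.59, 3.05, 2.95, 3.56)
m <- 4
sma <- stats::filter(x, rep(1/m, m), sides = 1)  # trailing moving average
sma  # NA NA NA 4.0425 3.5375 -- the first non-NA values of SMA(train_ma.ts, n = 4)
```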

Smoothing with the SMA method is carried out using the SMA() function. Here the window length is m = 4, passed via the n argument of SMA().

data.sma <-SMA(train_ma.ts, n=4)
data.sma
## Time Series:
## Start = 1 
## End = 292 
## Frequency = 1 
##   [1]     NA     NA     NA 4.0425 3.5375 3.3750 4.9650 5.6975 6.3650 6.3050
##  [11] 5.1825 4.4600 3.7475 3.9525 3.8100 4.9525 4.9375 4.9825 6.1850 5.6225
##  [21] 5.4650 5.2975 4.8875 5.8075 6.4675 6.1725 6.3300 5.6700 6.2975 6.8550
##  [31] 6.2575 6.2225 6.1800 5.8675 6.3325 5.7050 4.6525 4.7525 4.0975 4.6700
##  [41] 4.6575 5.2300 4.7175 4.4750 5.2000 6.1100 7.9850 9.6025 9.4150 8.2025
##  [51] 7.7225 6.2600 6.4725 7.1950 6.9100 6.6925 6.5250 6.8400 5.9625 6.3725
##  [61] 8.1300 7.9475 7.7925 6.7475 5.2850 4.4800 4.8825 5.4000 5.6175 5.2800
##  [71] 4.8125 5.6250 5.9950 6.3925 7.3825 7.0450 5.7775 5.6250 5.0850 6.5450
##  [81] 7.1325 6.8225 7.2250 5.7175 5.7375 5.9450 6.9325 6.8750 6.1325 6.2500
##  [91] 4.6175 4.5250 4.5800 5.3725 6.4850 6.7825 6.6700 5.8100 5.2275 4.8675
## [101] 4.8700 4.6275 5.2950 6.1000 6.5725 6.6350 6.1100 6.2750 6.1650 5.9925
## [111] 5.5725 5.1650 5.1825 5.2100 5.3000 5.4400 5.6425 6.7050 8.1775 7.7125
## [121] 8.3025 8.7925 7.3675 6.9150 7.5475 6.3250 5.7450 6.0775 5.4700 5.2925
## [131] 5.2700 4.9400 4.0925 4.0875 4.6500 5.3725 7.1325 7.1900 6.6650 5.7975
## [141] 5.8900 6.2500 6.4475 6.7100 5.3375 4.7700 4.8125 4.8825 5.3225 5.4500
## [151] 5.5725 5.3975 4.4425 4.4050 5.8350 5.7625 5.3600 5.2025 3.4875 3.5075
## [161] 3.5075 3.6800 4.3050 4.6225 4.7750 4.6350 4.1500 4.2200 4.3825 5.1125
## [171] 5.3375 4.9875 4.7975 4.1625 4.1925 4.0700 3.8650 3.5000 3.1075 3.7300
## [181] 4.0000 4.3600 5.0425 4.6425 4.3125 3.9025 3.1600 3.0200 3.4600 3.7025
## [191] 3.7700 3.2800 2.8025 2.6700 2.6500 2.9075 2.9750 3.0725 3.5100 3.5950
## [201] 3.7200 3.7750 3.4450 3.5700 3.5825 3.4525 3.6075 4.1875 4.6075 5.7400
## [211] 5.7650 4.7475 4.7575 3.9200 4.5025 4.9325 4.3100 4.0750 3.3725 4.4450
## [221] 7.3900 8.9850 8.7275 7.5650 5.6950 6.2725 6.6700 6.9025 6.4200 6.1450
## [231] 7.0075 7.5875 7.9175 6.8775 5.6300 5.0675 4.8050 3.9550 4.1425 3.5425
## [241] 3.4575 3.6100 3.6575 4.0025 4.1200 4.2675 4.4025 4.9425 5.1850 5.5225
## [251] 5.3700 4.7100 4.2175 3.6150 3.4675 3.7175 3.3675 3.5825 4.1325 4.1350
## [261] 5.5300 6.4250 6.2925 6.1025 5.4025 4.6650 4.1975 4.9325 4.6975 5.0175
## [271] 5.4800 4.8125 4.3350 6.0500 6.0275 6.1475 6.9425 4.9125 5.0850 5.0850
## [281] 4.8475 4.9875 4.6825 5.1000 5.3200 5.3500 5.1375 4.2850 4.0700 4.7625
## [291] 5.4925 6.0200

The smoothed value at period t is then used as the forecast for period t+1, so the one-step-ahead forecasts are as follows.
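The effect of prepending NA can be seen on a tiny example, using the first smoothed values computed above (a sketch; the names `s` and `f` are illustrative):

```r
# Sketch: prepending NA shifts the smoothed series one step forward,
# so the smoothed value at period t becomes the forecast for period t+1.
s <- c(4.0425, 3.5375)  # smoothed values at t = 4 and t = 5 (from above)
f <- c(NA, s)           # forecasts: f[3] is the forecast made at t = 5
f[3] == s[2]            # TRUE: the forecast equals the previous smoothed value
```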

data.ramal<-c(NA,data.sma)
data.ramal # one-step-ahead forecasts
##   [1]     NA     NA     NA     NA 4.0425 3.5375 3.3750 4.9650 5.6975 6.3650
##  [11] 6.3050 5.1825 4.4600 3.7475 3.9525 3.8100 4.9525 4.9375 4.9825 6.1850
##  [21] 5.6225 5.4650 5.2975 4.8875 5.8075 6.4675 6.1725 6.3300 5.6700 6.2975
##  [31] 6.8550 6.2575 6.2225 6.1800 5.8675 6.3325 5.7050 4.6525 4.7525 4.0975
##  [41] 4.6700 4.6575 5.2300 4.7175 4.4750 5.2000 6.1100 7.9850 9.6025 9.4150
##  [51] 8.2025 7.7225 6.2600 6.4725 7.1950 6.9100 6.6925 6.5250 6.8400 5.9625
##  [61] 6.3725 8.1300 7.9475 7.7925 6.7475 5.2850 4.4800 4.8825 5.4000 5.6175
##  [71] 5.2800 4.8125 5.6250 5.9950 6.3925 7.3825 7.0450 5.7775 5.6250 5.0850
##  [81] 6.5450 7.1325 6.8225 7.2250 5.7175 5.7375 5.9450 6.9325 6.8750 6.1325
##  [91] 6.2500 4.6175 4.5250 4.5800 5.3725 6.4850 6.7825 6.6700 5.8100 5.2275
## [101] 4.8675 4.8700 4.6275 5.2950 6.1000 6.5725 6.6350 6.1100 6.2750 6.1650
## [111] 5.9925 5.5725 5.1650 5.1825 5.2100 5.3000 5.4400 5.6425 6.7050 8.1775
## [121] 7.7125 8.3025 8.7925 7.3675 6.9150 7.5475 6.3250 5.7450 6.0775 5.4700
## [131] 5.2925 5.2700 4.9400 4.0925 4.0875 4.6500 5.3725 7.1325 7.1900 6.6650
## [141] 5.7975 5.8900 6.2500 6.4475 6.7100 5.3375 4.7700 4.8125 4.8825 5.3225
## [151] 5.4500 5.5725 5.3975 4.4425 4.4050 5.8350 5.7625 5.3600 5.2025 3.4875
## [161] 3.5075 3.5075 3.6800 4.3050 4.6225 4.7750 4.6350 4.1500 4.2200 4.3825
## [171] 5.1125 5.3375 4.9875 4.7975 4.1625 4.1925 4.0700 3.8650 3.5000 3.1075
## [181] 3.7300 4.0000 4.3600 5.0425 4.6425 4.3125 3.9025 3.1600 3.0200 3.4600
## [191] 3.7025 3.7700 3.2800 2.8025 2.6700 2.6500 2.9075 2.9750 3.0725 3.5100
## [201] 3.5950 3.7200 3.7750 3.4450 3.5700 3.5825 3.4525 3.6075 4.1875 4.6075
## [211] 5.7400 5.7650 4.7475 4.7575 3.9200 4.5025 4.9325 4.3100 4.0750 3.3725
## [221] 4.4450 7.3900 8.9850 8.7275 7.5650 5.6950 6.2725 6.6700 6.9025 6.4200
## [231] 6.1450 7.0075 7.5875 7.9175 6.8775 5.6300 5.0675 4.8050 3.9550 4.1425
## [241] 3.5425 3.4575 3.6100 3.6575 4.0025 4.1200 4.2675 4.4025 4.9425 5.1850
## [251] 5.5225 5.3700 4.7100 4.2175 3.6150 3.4675 3.7175 3.3675 3.5825 4.1325
## [261] 4.1350 5.5300 6.4250 6.2925 6.1025 5.4025 4.6650 4.1975 4.9325 4.6975
## [271] 5.0175 5.4800 4.8125 4.3350 6.0500 6.0275 6.1475 6.9425 4.9125 5.0850
## [281] 5.0850 4.8475 4.9875 4.6825 5.1000 5.3200 5.3500 5.1375 4.2850 4.0700
## [291] 4.7625 5.4925 6.0200

Next, forecasts are produced for 24 periods ahead. With the SMA method, every forecast beyond one step ahead equals the one-step-ahead forecast. The actual training data, the smoothed values, and the 24-period forecasts are then combined. (Note that the test set itself contains 73 observations, so these forecasts cover only its first 24 periods.)

data.gab <- cbind(
  aktual    = c(train_ma.ts, rep(NA, 24)),
  pemulusan = c(data.sma, rep(NA, 24)),
  ramalan   = c(data.ramal, rep(data.ramal[length(data.ramal)], 23))
)
data.gab # forecasts for 24 periods ahead
##        aktual pemulusan ramalan
##   [1,]   5.58        NA      NA
##   [2,]   4.59        NA      NA
##   [3,]   3.05        NA      NA
##   [4,]   2.95    4.0425      NA
##   [5,]   3.56    3.5375  4.0425
##   [6,]   3.94    3.3750  3.5375
##   [7,]   9.41    4.9650  3.3750
##   [8,]   5.88    5.6975  4.9650
##   [9,]   6.23    6.3650  5.6975
##  [10,]   3.70    6.3050  6.3650
##  [11,]   4.92    5.1825  6.3050
##  [12,]   2.99    4.4600  5.1825
##  [13,]   3.38    3.7475  4.4600
##  [14,]   4.52    3.9525  3.7475
##  [15,]   4.35    3.8100  3.9525
##  [16,]   7.56    4.9525  3.8100
##  [17,]   3.32    4.9375  4.9525
##  [18,]   4.70    4.9825  4.9375
##  [19,]   9.16    6.1850  4.9825
##  [20,]   5.31    5.6225  6.1850
##  [21,]   2.69    5.4650  5.6225
##  [22,]   4.03    5.2975  5.4650
##  [23,]   7.52    4.8875  5.2975
##  [24,]   8.99    5.8075  4.8875
##  [25,]   5.33    6.4675  5.8075
##  [26,]   2.85    6.1725  6.4675
##  [27,]   8.15    6.3300  6.1725
##  [28,]   6.35    5.6700  6.3300
##  [29,]   7.84    6.2975  5.6700
##  [30,]   5.08    6.8550  6.2975
##  [31,]   5.76    6.2575  6.8550
##  [32,]   6.21    6.2225  6.2575
##  [33,]   7.67    6.1800  6.2225
##  [34,]   3.83    5.8675  6.1800
##  [35,]   7.62    6.3325  5.8675
##  [36,]   3.70    5.7050  6.3325
##  [37,]   3.46    4.6525  5.7050
##  [38,]   4.23    4.7525  4.6525
##  [39,]   5.00    4.0975  4.7525
##  [40,]   5.99    4.6700  4.0975
##  [41,]   3.41    4.6575  4.6700
##  [42,]   6.52    5.2300  4.6575
##  [43,]   2.95    4.7175  5.2300
##  [44,]   5.02    4.4750  4.7175
##  [45,]   6.31    5.2000  4.4750
##  [46,]  10.16    6.1100  5.2000
##  [47,]  10.45    7.9850  6.1100
##  [48,]  11.49    9.6025  7.9850
##  [49,]   5.56    9.4150  9.6025
##  [50,]   5.31    8.2025  9.4150
##  [51,]   8.53    7.7225  8.2025
##  [52,]   5.64    6.2600  7.7225
##  [53,]   6.41    6.4725  6.2600
##  [54,]   8.20    7.1950  6.4725
##  [55,]   7.39    6.9100  7.1950
##  [56,]   4.77    6.6925  6.9100
##  [57,]   5.74    6.5250  6.6925
##  [58,]   9.46    6.8400  6.5250
##  [59,]   3.88    5.9625  6.8400
##  [60,]   6.41    6.3725  5.9625
##  [61,]  12.77    8.1300  6.3725
##  [62,]   8.73    7.9475  8.1300
##  [63,]   3.26    7.7925  7.9475
##  [64,]   2.23    6.7475  7.7925
##  [65,]   6.92    5.2850  6.7475
##  [66,]   5.51    4.4800  5.2850
##  [67,]   4.87    4.8825  4.4800
##  [68,]   4.30    5.4000  4.8825
##  [69,]   7.79    5.6175  5.4000
##  [70,]   4.16    5.2800  5.6175
##  [71,]   3.00    4.8125  5.2800
##  [72,]   7.55    5.6250  4.8125
##  [73,]   9.27    5.9950  5.6250
##  [74,]   5.75    6.3925  5.9950
##  [75,]   6.96    7.3825  6.3925
##  [76,]   6.20    7.0450  7.3825
##  [77,]   4.20    5.7775  7.0450
##  [78,]   5.14    5.6250  5.7775
##  [79,]   4.80    5.0850  5.6250
##  [80,]  12.04    6.5450  5.0850
##  [81,]   6.55    7.1325  6.5450
##  [82,]   3.90    6.8225  7.1325
##  [83,]   6.41    7.2250  6.8225
##  [84,]   6.01    5.7175  7.2250
##  [85,]   6.63    5.7375  5.7175
##  [86,]   4.73    5.9450  5.7375
##  [87,]  10.36    6.9325  5.9450
##  [88,]   5.78    6.8750  6.9325
##  [89,]   3.66    6.1325  6.8750
##  [90,]   5.20    6.2500  6.1325
##  [91,]   3.83    4.6175  6.2500
##  [92,]   5.41    4.5250  4.6175
##  [93,]   3.88    4.5800  4.5250
##  [94,]   8.37    5.3725  4.5800
##  [95,]   8.28    6.4850  5.3725
##  [96,]   6.60    6.7825  6.4850
##  [97,]   3.43    6.6700  6.7825
##  [98,]   4.93    5.8100  6.6700
##  [99,]   5.95    5.2275  5.8100
## [100,]   5.16    4.8675  5.2275
## [101,]   3.44    4.8700  4.8675
## [102,]   3.96    4.6275  4.8700
## [103,]   8.62    5.2950  4.6275
## [104,]   8.38    6.1000  5.2950
## [105,]   5.33    6.5725  6.1000
## [106,]   4.21    6.6350  6.5725
## [107,]   6.52    6.1100  6.6350
## [108,]   9.04    6.2750  6.1100
## [109,]   4.89    6.1650  6.2750
## [110,]   3.52    5.9925  6.1650
## [111,]   4.84    5.5725  5.9925
## [112,]   7.41    5.1650  5.5725
## [113,]   4.96    5.1825  5.1650
## [114,]   3.63    5.2100  5.1825
## [115,]   5.20    5.3000  5.2100
## [116,]   7.97    5.4400  5.3000
## [117,]   5.77    5.6425  5.4400
## [118,]   7.88    6.7050  5.6425
## [119,]  11.09    8.1775  6.7050
## [120,]   6.11    7.7125  8.1775
## [121,]   8.13    8.3025  7.7125
## [122,]   9.84    8.7925  8.3025
## [123,]   5.39    7.3675  8.7925
## [124,]   4.30    6.9150  7.3675
## [125,]  10.66    7.5475  6.9150
## [126,]   4.95    6.3250  7.5475
## [127,]   3.07    5.7450  6.3250
## [128,]   5.63    6.0775  5.7450
## [129,]   8.23    5.4700  6.0775
## [130,]   4.24    5.2925  5.4700
## [131,]   2.98    5.2700  5.2925
## [132,]   4.31    4.9400  5.2700
## [133,]   4.84    4.0925  4.9400
## [134,]   4.22    4.0875  4.0925
## [135,]   5.23    4.6500  4.0875
## [136,]   7.20    5.3725  4.6500
## [137,]  11.88    7.1325  5.3725
## [138,]   4.45    7.1900  7.1325
## [139,]   3.13    6.6650  7.1900
## [140,]   3.73    5.7975  6.6650
## [141,]  12.25    5.8900  5.7975
## [142,]   5.89    6.2500  5.8900
## [143,]   3.92    6.4475  6.2500
## [144,]   4.78    6.7100  6.4475
## [145,]   6.76    5.3375  6.7100
## [146,]   3.62    4.7700  5.3375
## [147,]   4.09    4.8125  4.7700
## [148,]   5.06    4.8825  4.8125
## [149,]   8.52    5.3225  4.8825
## [150,]   4.13    5.4500  5.3225
## [151,]   4.58    5.5725  5.4500
## [152,]   4.36    5.3975  5.5725
## [153,]   4.70    4.4425  5.3975
## [154,]   3.98    4.4050  4.4425
## [155,]  10.30    5.8350  4.4050
## [156,]   4.07    5.7625  5.8350
## [157,]   3.09    5.3600  5.7625
## [158,]   3.35    5.2025  5.3600
## [159,]   3.44    3.4875  5.2025
## [160,]   4.15    3.5075  3.4875
## [161,]   3.09    3.5075  3.5075
## [162,]   4.04    3.6800  3.5075
## [163,]   5.94    4.3050  3.6800
## [164,]   5.42    4.6225  4.3050
## [165,]   3.70    4.7750  4.6225
## [166,]   3.48    4.6350  4.7750
## [167,]   4.00    4.1500  4.6350
## [168,]   5.70    4.2200  4.1500
## [169,]   4.35    4.3825  4.2200
## [170,]   6.40    5.1125  4.3825
## [171,]   4.90    5.3375  5.1125
## [172,]   4.30    4.9875  5.3375
## [173,]   3.59    4.7975  4.9875
## [174,]   3.86    4.1625  4.7975
## [175,]   5.02    4.1925  4.1625
## [176,]   3.81    4.0700  4.1925
## [177,]   2.77    3.8650  4.0700
## [178,]   2.40    3.5000  3.8650
## [179,]   3.45    3.1075  3.5000
## [180,]   6.30    3.7300  3.1075
## [181,]   3.85    4.0000  3.7300
## [182,]   3.84    4.3600  4.0000
## [183,]   6.18    5.0425  4.3600
## [184,]   4.70    4.6425  5.0425
## [185,]   2.53    4.3125  4.6425
## [186,]   2.20    3.9025  4.3125
## [187,]   3.21    3.1600  3.9025
## [188,]   4.14    3.0200  3.1600
## [189,]   4.29    3.4600  3.0200
## [190,]   3.17    3.7025  3.4600
## [191,]   3.48    3.7700  3.7025
## [192,]   2.18    3.2800  3.7700
## [193,]   2.38    2.8025  3.2800
## [194,]   2.64    2.6700  2.8025
## [195,]   3.40    2.6500  2.6700
## [196,]   3.21    2.9075  2.6500
## [197,]   2.65    2.9750  2.9075
## [198,]   3.03    3.0725  2.9750
## [199,]   5.15    3.5100  3.0725
## [200,]   3.55    3.5950  3.5100
## [201,]   3.15    3.7200  3.5950
## [202,]   3.25    3.7750  3.7200
## [203,]   3.83    3.4450  3.7750
## [204,]   4.05    3.5700  3.4450
## [205,]   3.20    3.5825  3.5700
## [206,]   2.73    3.4525  3.5825
## [207,]   4.45    3.6075  3.4525
## [208,]   6.37    4.1875  3.6075
## [209,]   4.88    4.6075  4.1875
## [210,]   7.26    5.7400  4.6075
## [211,]   4.55    5.7650  5.7400
## [212,]   2.30    4.7475  5.7650
## [213,]   4.92    4.7575  4.7475
## [214,]   3.91    3.9200  4.7575
## [215,]   6.88    4.5025  3.9200
## [216,]   4.02    4.9325  4.5025
## [217,]   2.43    4.3100  4.9325
## [218,]   2.97    4.0750  4.3100
## [219,]   4.07    3.3725  4.0750
## [220,]   8.31    4.4450  3.3725
## [221,]  14.21    7.3900  4.4450
## [222,]   9.35    8.9850  7.3900
## [223,]   3.04    8.7275  8.9850
## [224,]   3.66    7.5650  8.7275
## [225,]   6.73    5.6950  7.5650
## [226,]  11.66    6.2725  5.6950
## [227,]   4.63    6.6700  6.2725
## [228,]   4.59    6.9025  6.6700
## [229,]   4.80    6.4200  6.9025
## [230,]  10.56    6.1450  6.4200
## [231,]   8.08    7.0075  6.1450
## [232,]   6.91    7.5875  7.0075
## [233,]   6.12    7.9175  7.5875
## [234,]   6.40    6.8775  7.9175
## [235,]   3.09    5.6300  6.8775
## [236,]   4.66    5.0675  5.6300
## [237,]   5.07    4.8050  5.0675
## [238,]   3.00    3.9550  4.8050
## [239,]   3.84    4.1425  3.9550
## [240,]   2.26    3.5425  4.1425
## [241,]   4.73    3.4575  3.5425
## [242,]   3.61    3.6100  3.4575
## [243,]   4.03    3.6575  3.6100
## [244,]   3.64    4.0025  3.6575
## [245,]   5.20    4.1200  4.0025
## [246,]   4.20    4.2675  4.1200
## [247,]   4.57    4.4025  4.2675
## [248,]   5.80    4.9425  4.4025
## [249,]   6.17    5.1850  4.9425
## [250,]   5.55    5.5225  5.1850
## [251,]   3.96    5.3700  5.5225
## [252,]   3.16    4.7100  5.3700
## [253,]   4.20    4.2175  4.7100
## [254,]   3.14    3.6150  4.2175
## [255,]   3.37    3.4675  3.6150
## [256,]   4.16    3.7175  3.4675
## [257,]   2.80    3.3675  3.7175
## [258,]   4.00    3.5825  3.3675
## [259,]   5.57    4.1325  3.5825
## [260,]   4.17    4.1350  4.1325
## [261,]   8.38    5.5300  4.1350
## [262,]   7.58    6.4250  5.5300
## [263,]   5.04    6.2925  6.4250
## [264,]   3.41    6.1025  6.2925
## [265,]   5.58    5.4025  6.1025
## [266,]   4.63    4.6650  5.4025
## [267,]   3.17    4.1975  4.6650
## [268,]   6.35    4.9325  4.1975
## [269,]   4.64    4.6975  4.9325
## [270,]   5.91    5.0175  4.6975
## [271,]   5.02    5.4800  5.0175
## [272,]   3.68    4.8125  5.4800
## [273,]   2.73    4.3350  4.8125
## [274,]  12.77    6.0500  4.3350
## [275,]   4.93    6.0275  6.0500
## [276,]   4.16    6.1475  6.0275
## [277,]   5.91    6.9425  6.1475
## [278,]   4.65    4.9125  6.9425
## [279,]   5.62    5.0850  4.9125
## [280,]   4.16    5.0850  5.0850
## [281,]   4.96    4.8475  5.0850
## [282,]   5.21    4.9875  4.8475
## [283,]   4.40    4.6825  4.9875
## [284,]   5.83    5.1000  4.6825
## [285,]   5.84    5.3200  5.1000
## [286,]   5.33    5.3500  5.3200
## [287,]   3.55    5.1375  5.3500
## [288,]   2.42    4.2850  5.1375
## [289,]   4.98    4.0700  4.2850
## [290,]   8.10    4.7625  4.0700
## [291,]   6.47    5.4925  4.7625
## [292,]   4.53    6.0200  5.4925
## [293,]     NA        NA  6.0200
## [294,]     NA        NA  6.0200
## [295,]     NA        NA  6.0200
## [296,]     NA        NA  6.0200
## [297,]     NA        NA  6.0200
## [298,]     NA        NA  6.0200
## [299,]     NA        NA  6.0200
## [300,]     NA        NA  6.0200
## [301,]     NA        NA  6.0200
## [302,]     NA        NA  6.0200
## [303,]     NA        NA  6.0200
## [304,]     NA        NA  6.0200
## [305,]     NA        NA  6.0200
## [306,]     NA        NA  6.0200
## [307,]     NA        NA  6.0200
## [308,]     NA        NA  6.0200
## [309,]     NA        NA  6.0200
## [310,]     NA        NA  6.0200
## [311,]     NA        NA  6.0200
## [312,]     NA        NA  6.0200
## [313,]     NA        NA  6.0200
## [314,]     NA        NA  6.0200
## [315,]     NA        NA  6.0200
## [316,]     NA        NA  6.0200

The time series plot of the forecast results is as follows.

ts.plot(data1.ts, xlab="Time Period ", ylab="Wind Speed", main= "SMA N=4 Data Wind Speed")
points(data1.ts)
lines(data.gab[,2],col="green",lwd=2)
lines(data.gab[,3],col="red",lwd=2)
legend("topleft",c("data aktual","data pemulusan","data peramalan"), lty=8, col=c("black","green","red"), cex=0.5)

Next, accuracy is computed using the Sum of Squared Errors (SSE), Mean Squared Error (MSE), and Mean Absolute Percentage Error (MAPE). The accuracy computation is performed on both the training data and the test data.
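The three measures can be wrapped in a small helper. A sketch; the function name `akurasi` and the toy vectors are illustrative, not part of the original code:

```r
# Sketch: compute SSE, MSE, and MAPE from actual values and forecasts
# of equal length (MAPE is expressed as a percentage).
akurasi <- function(aktual, ramalan) {
  e <- aktual - ramalan
  c(SSE  = sum(e^2),
    MSE  = mean(e^2),
    MAPE = mean(abs(e / aktual)) * 100)
}
akurasi(aktual = c(10, 20, 30), ramalan = c(12, 18, 33))
# SSE = 17, MSE = 17/3, MAPE = 40/3 (about 13.3%)
```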

# compute accuracy measures on the training data
error_train.sma = train_ma.ts-data.ramal[1:length(train_ma.ts)]
SSE_train.sma = sum(error_train.sma[5:length(train_ma.ts)]^2)
MSE_train.sma = mean(error_train.sma[5:length(train_ma.ts)]^2)
MAPE_train.sma = mean(abs((error_train.sma[5:length(train_ma.ts)]/train_ma.ts[5:length(train_ma.ts)])*100))

akurasi_train.sma <- matrix(c(SSE_train.sma, MSE_train.sma, MAPE_train.sma))
row.names(akurasi_train.sma)<- c("SSE", "MSE", "MAPE")
colnames(akurasi_train.sma) <- c("Akurasi m = 4")
akurasi_train.sma
##      Akurasi m = 4
## SSE    1526.217325
## MSE       5.299366
## MAPE     33.695534

Here the training MAPE for the SMA smoothing method is above 10%, which can be categorized as poor accuracy. Next, the MAPE is computed on the test data for the SMA smoothing method.

# compute accuracy measures on the test data
# (the 24 forecasts occupy rows 293:316 of data.gab, so they are compared
# with the first 24 test observations)
error_test.sma = test_ma.ts[1:24] - data.gab[293:316, 3]
SSE_test.sma = sum(error_test.sma^2)
MSE_test.sma = mean(error_test.sma^2)
MAPE_test.sma = mean(abs((error_test.sma / test_ma.ts[1:24]) * 100))

akurasi_test.sma <- matrix(c(SSE_test.sma, MSE_test.sma, MAPE_test.sma))
row.names(akurasi_test.sma) <- c("SSE", "MSE", "MAPE")
colnames(akurasi_test.sma) <- c("Akurasi m = 4")
akurasi_test.sma

The accuracy computed on the test data also yields a MAPE above 10%, so this accuracy is likewise categorized as poor.

Double Moving Average (DMA)

The Double Moving Average (DMA) smoothing method is essentially similar to SMA, but it is more suitable for data with a trend pattern. In this method the moving-average smoothing is applied twice: first to the data, then to the result of the first smoothing.

dma <- SMA(data.sma, n = 4)          # second smoothing of the SMA series
At <- 2*data.sma - dma               # level component
Bt <- 2/(4-1)*(data.sma - dma)       # trend (slope) component
data.dma <- At + Bt                  # one-step-ahead DMA value
data.ramal2 <- c(NA, data.dma)       # shift: forecast for period t+1

t = 1:24
f = c()

for (i in t) {
  f[i] = At[length(At)] + Bt[length(Bt)]*(i)
}
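As a sanity check on the DMA construction above: on an exactly linear series, At recovers the level and Bt the slope, so At + Bt*h continues the line. A sketch using stats::filter in place of SMA() so it is self-contained:

```r
# Sketch: DMA on a perfectly linear series y_t = 2t + 3 (slope 2)
y  <- 2 * (1:20) + 3
m  <- 4
s1 <- stats::filter(y,  rep(1/m, m), sides = 1)  # first smoothing
s2 <- stats::filter(s1, rep(1/m, m), sides = 1)  # second smoothing
At <- 2 * s1 - s2                                # level component
Bt <- 2 / (m - 1) * (s1 - s2)                    # slope component
Bt[20]               # slope estimate: 2
At[20] + Bt[20] * 1  # one-step forecast: 45, i.e. y at t = 21
```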

data.gab2 <- cbind(
  aktual     = c(train_ma.ts, rep(NA, 24)),
  pemulusan1 = c(data.sma, rep(NA, 24)),
  pemulusan2 = c(data.dma, rep(NA, 24)),
  At         = c(At, rep(NA, 24)),
  Bt         = c(Bt, rep(NA, 24)),
  ramalan    = c(data.ramal2, f[-1])
)
data.gab2
##        aktual pemulusan1 pemulusan2        At           Bt   ramalan
##   [1,]   5.58         NA         NA        NA           NA        NA
##   [2,]   4.59         NA         NA        NA           NA        NA
##   [3,]   3.05         NA         NA        NA           NA        NA
##   [4,]   2.95     4.0425         NA        NA           NA        NA
##   [5,]   3.56     3.5375         NA        NA           NA        NA
##   [6,]   3.94     3.3750         NA        NA           NA        NA
##   [7,]   9.41     4.9650   6.606667  5.950000  0.656666667        NA
##   [8,]   5.88     5.6975   7.870417  7.001250  0.869166667  6.606667
##   [9,]   6.23     6.3650   8.472292  7.629375  0.842916667  7.870417
##  [10,]   3.70     6.3050   7.091458  6.776875  0.314583333  8.472292
##  [11,]   4.92     5.1825   4.007500  4.477500 -0.470000000  7.091458
##  [12,]   2.99     4.4600   2.596458  3.341875 -0.745416667  4.007500
##  [13,]   3.38     3.7475   1.787083  2.571250 -0.784166667  2.596458
##  [14,]   4.52     3.9525   3.313958  3.569375 -0.255416667  1.787083
##  [15,]   4.35     3.8100   3.505833  3.627500 -0.121666667  3.313958
##  [16,]   7.56     4.9525   6.347292  5.789375  0.557916667  3.505833
##  [17,]   3.32     4.9375   5.811458  5.461875  0.349583333  6.347292
##  [18,]   4.70     4.9825   5.502292  5.294375  0.207916667  5.811458
##  [19,]   9.16     6.1850   7.719375  7.105625  0.613750000  5.502292
##  [20,]   5.31     5.6225   5.940208  5.813125  0.127083333  7.719375
##  [21,]   2.69     5.4650   5.300417  5.366250 -0.065833333  5.940208
##  [22,]   4.03     5.2975   4.722500  4.952500 -0.230000000  5.300417
##  [23,]   7.52     4.8875   4.169792  4.456875 -0.287083333  4.722500
##  [24,]   8.99     5.8075   6.546042  6.250625  0.295416667  4.169792
##  [25,]   5.33     6.4675   7.888333  7.320000  0.568333333  6.546042
##  [26,]   2.85     6.1725   6.737083  6.511250  0.225833333  7.888333
##  [27,]   8.15     6.3300   6.556042  6.465625  0.090416667  6.737083
##  [28,]   6.35     5.6700   4.853333  5.180000 -0.326666667  6.556042
##  [29,]   7.84     6.2975   6.597500  6.477500  0.120000000  4.853333
##  [30,]   5.08     6.8550   7.799792  7.421875  0.377916667  6.597500
##  [31,]   5.76     6.2575   6.236667  6.245000 -0.008333333  7.799792
##  [32,]   6.21     6.2225   5.913125  6.036875 -0.123750000  6.236667
##  [33,]   7.67     6.1800   5.848750  5.981250 -0.132500000  5.913125
##  [34,]   3.83     5.8675   5.426875  5.603125 -0.176250000  5.848750
##  [35,]   7.62     6.3325   6.635625  6.514375  0.121250000  5.426875
##  [36,]   3.70     5.7050   5.177917  5.388750 -0.210833333  6.635625
##  [37,]   3.46     4.6525   3.007708  3.665625 -0.657916667  5.177917
##  [38,]   4.23     4.7525   3.738958  4.144375 -0.405416667  3.007708
##  [39,]   5.00     4.0975   2.923542  3.393125 -0.469583333  3.738958
##  [40,]   5.99     4.6700   4.881458  4.796875  0.084583333  2.923542
##  [41,]   3.41     4.6575   4.846042  4.770625  0.075416667  4.881458
##  [42,]   6.52     5.2300   6.173750  5.796250  0.377500000  4.846042
##  [43,]   2.95     4.7175   4.548750  4.616250 -0.067500000  6.173750
##  [44,]   5.02     4.4750   3.983333  4.180000 -0.196666667  4.548750
##  [45,]   6.31     5.2000   5.690625  5.494375  0.196250000  3.983333
##  [46,]  10.16     6.1100   7.750625  7.094375  0.656250000  5.690625
##  [47,]  10.45     7.9850  11.389167 10.027500  1.361666667  7.750625
##  [48,]  11.49     9.6025  13.566042 11.980625  1.585416667 11.389167
##  [49,]   5.56     9.4150  11.309792 10.551875  0.757916667 13.566042
##  [50,]   5.31     8.2025   7.204583  7.603750 -0.399166667 11.309792
##  [51,]   8.53     7.7225   6.033958  6.709375 -0.675416667  7.204583
##  [52,]   5.64     6.2600   3.526667  4.620000 -1.093333333  6.033958
##  [53,]   6.41     6.4725   5.319375  5.780625 -0.461250000  3.526667
##  [54,]   8.20     7.1950   7.665833  7.477500  0.188333333  5.319375
##  [55,]   7.39     6.9100   7.244375  7.110625  0.133750000  7.665833
##  [56,]   4.77     6.6925   6.484167  6.567500 -0.083333333  7.244375
##  [57,]   5.74     6.5250   6.015625  6.219375 -0.203750000  6.484167
##  [58,]   9.46     6.8400   7.003542  6.938125  0.065416667  6.015625
##  [59,]   3.88     5.9625   5.058333  5.420000 -0.361666667  7.003542
##  [60,]   6.41     6.3725   6.285000  6.320000 -0.035000000  5.058333
##  [61,]  12.77     8.1300  10.302917  9.433750  0.869166667  6.285000
##  [62,]   8.73     7.9475   9.354792  8.791875  0.562916667 10.302917
##  [63,]   3.26     7.7925   8.178958  8.024375  0.154583333  9.354792
##  [64,]   2.23     6.7475   5.236042  5.840625 -0.604583333  8.178958
##  [65,]   6.92     5.2850   2.521458  3.626875 -1.105416667  5.236042
##  [66,]   5.51     4.4800   1.819583  2.883750 -1.064166667  2.521458
##  [67,]   4.87     4.8825   4.105417  4.416250 -0.310833333  1.819583
##  [68,]   4.30     5.4000   6.046875  5.788125  0.258750000  4.105417
##  [69,]   7.79     5.6175   6.488333  6.140000  0.348333333  6.046875
##  [70,]   4.16     5.2800   5.255000  5.265000 -0.010000000  6.488333
##  [71,]   3.00     4.8125   4.037500  4.347500 -0.310000000  5.255000
##  [72,]   7.55     5.6250   6.110417  5.916250  0.194166667  4.037500
##  [73,]   9.27     5.9950   6.939792  6.561875  0.377916667  6.110417
##  [74,]   5.75     6.3925   7.536250  7.078750  0.457500000  6.939792
##  [75,]   6.96     7.3825   9.105417  8.416250  0.689166667  7.536250
##  [76,]   6.20     7.0450   7.613750  7.386250  0.227500000  9.105417
##  [77,]   4.20     5.7775   4.324375  4.905625 -0.581250000  7.613750
##  [78,]   5.14     5.6250   4.237500  4.792500 -0.555000000  4.324375
##  [79,]   4.80     5.0850   3.754792  4.286875 -0.532083333  4.237500
##  [80,]  12.04     6.5450   7.856458  7.331875  0.524583333  3.754792
##  [81,]   6.55     7.1325   8.858542  8.168125  0.690416667  7.856458
##  [82,]   3.90     6.8225   7.532917  7.248750  0.284166667  8.858542
##  [83,]   6.41     7.2250   7.714583  7.518750  0.195833333  7.532917
##  [84,]   6.01     5.7175   4.039375  4.710625 -0.671250000  7.714583
##  [85,]   6.63     5.7375   4.673958  5.099375 -0.425416667  4.039375
##  [86,]   4.73     5.9450   5.592917  5.733750 -0.140833333  4.673958
##  [87,]  10.36     6.9325   8.348125  7.781875  0.566250000  5.592917
##  [88,]   5.78     6.8750   7.712500  7.377500  0.335000000  8.348125
##  [89,]   3.66     6.1325   5.567917  5.793750 -0.225833333  7.712500
##  [90,]   5.20     6.2500   5.754167  5.952500 -0.198333333  5.567917
##  [91,]   3.83     4.6175   2.365417  3.266250 -0.900833333  5.754167
##  [92,]   5.41     4.5250   3.097917  3.668750 -0.570833333  2.365417
##  [93,]   3.88     4.5800   3.891458  4.166875 -0.275416667  3.097917
##  [94,]   8.37     5.3725   6.370417  5.971250  0.399166667  3.891458
##  [95,]   8.28     6.4850   8.558958  7.729375  0.829583333  6.370417
##  [96,]   6.60     6.7825   8.411667  7.760000  0.651666667  8.558958
##  [97,]   3.43     6.6700   7.240833  7.012500  0.228333333  8.411667
##  [98,]   4.93     5.8100   4.765208  5.183125 -0.417916667  7.240833
##  [99,]   5.95     5.2275   3.735833  4.332500 -0.596666667  4.765208
## [100,]   5.16     4.8675   3.573750  4.091250 -0.517500000  3.735833
## [101,]   3.44     4.8700   4.330417  4.546250 -0.215833333  3.573750
## [102,]   3.96     4.6275   4.176458  4.356875 -0.180416667  4.330417
## [103,]   8.62     5.2950   5.928333  5.675000  0.253333333  4.176458
## [104,]   8.38     6.1000   7.561458  6.976875  0.584583333  5.928333
## [105,]   5.33     6.5725   8.112083  7.496250  0.615833333  7.561458
## [106,]   4.21     6.6350   7.442292  7.119375  0.322916667  8.112083
## [107,]   6.52     6.1100   5.702708  5.865625 -0.162916667  7.442292
## [108,]   9.04     6.2750   6.069792  6.151875 -0.082083333  5.702708
## [109,]   4.89     6.1650   5.946250  6.033750 -0.087500000  6.069792
## [110,]   3.52     5.9925   5.753958  5.849375 -0.095416667  5.946250
## [111,]   4.84     5.5725   4.857917  5.143750 -0.285833333  5.753958
## [112,]   7.41     5.1650   4.233750  4.606250 -0.372500000  4.857917
## [113,]   4.96     5.1825   4.689792  4.886875 -0.197083333  4.233750
## [114,]   3.63     5.2100   5.089167  5.137500 -0.048333333  4.689792
## [115,]   5.20     5.3000   5.442708  5.385625  0.057083333  5.089167
## [116,]   7.97     5.4400   5.701458  5.596875  0.104583333  5.442708
## [117,]   5.77     5.6425   6.049792  5.886875  0.162916667  5.701458
## [118,]   7.88     6.7050   8.260208  7.638125  0.622083333  6.049792
## [119,]  11.09     8.1775  10.987917  9.863750  1.124166667  8.260208
## [120,]   6.11     7.7125   8.801042  8.365625  0.435416667 10.987917
## [121,]   8.13     8.3025   9.266042  8.880625  0.385416667  8.801042
## [122,]   9.84     8.7925   9.702917  9.338750  0.364166667  9.266042
## [123,]   5.39     7.3675   6.240417  6.691250 -0.450833333  9.702917
## [124,]   4.30     6.9150   5.366042  5.985625 -0.619583333  6.240417
## [125,]  10.66     7.5475   7.367292  7.439375 -0.072083333  5.366042
## [126,]   4.95     6.3250   5.135417  5.611250 -0.475833333  7.367292
## [127,]   3.07     5.7450   4.264792  4.856875 -0.592083333  5.135417
## [128,]   5.63     6.0775   5.500417  5.731250 -0.230833333  4.264792
## [129,]   8.23     5.4700   4.746042  5.035625 -0.289583333  5.500417
## [130,]   4.24     5.2925   4.702917  4.938750 -0.235833333  4.746042
## [131,]   2.98     5.2700   4.840833  5.012500 -0.171666667  4.702917
## [132,]   4.31     4.9400   4.434792  4.636875 -0.202083333  4.840833
## [133,]   4.84     4.0925   2.748750  3.286250 -0.537500000  4.434792
## [134,]   4.22     4.0875   3.237500  3.577500 -0.340000000  2.748750
## [135,]   5.23     4.6500   4.995833  4.857500  0.138333333  3.237500
## [136,]   7.20     5.3725   6.742292  6.194375  0.547916667  4.995833
## [137,]  11.88     7.1325  10.168958  8.954375  1.214583333  6.742292
## [138,]   4.45     7.1900   9.029583  8.293750  0.735833333 10.168958
## [139,]   3.13     6.6650   6.790000  6.740000  0.050000000  9.029583
## [140,]   3.73     5.7975   4.299583  4.898750 -0.599166667  6.790000
## [141,]  12.25     5.8900   5.063958  5.394375 -0.330416667  4.299583
## [142,]   5.89     6.2500   6.415625  6.349375  0.066250000  5.063958
## [143,]   3.92     6.4475   7.032917  6.798750  0.234166667  6.415625
## [144,]   4.78     6.7100   7.352708  7.095625  0.257083333  7.032917
## [145,]   6.76     5.3375   3.922917  4.488750 -0.565833333  7.352708
## [146,]   3.62     4.7700   3.026250  3.723750 -0.697500000  3.922917
## [147,]   4.09     4.8125   3.820833  4.217500 -0.396666667  3.026250
## [148,]   5.06     4.8825   4.768958  4.814375 -0.045416667  3.820833
## [149,]   8.52     5.3225   5.948542  5.698125  0.250416667  4.768958
## [150,]   4.13     5.4500   6.005208  5.783125  0.222083333  5.948542
## [151,]   4.58     5.5725   6.015208  5.838125  0.177083333  6.005208
## [152,]   4.36     5.3975   5.333958  5.359375 -0.025416667  6.015208
## [153,]   4.70     4.4425   3.153958  3.669375 -0.515416667  5.333958
## [154,]   3.98     4.4050   3.489375  3.855625 -0.366250000  3.153958
## [155,]  10.30     5.8350   7.193333  6.650000  0.543333333  3.489375
## [156,]   4.07     5.7625   6.847917  6.413750  0.434166667  7.193333
## [157,]   3.09     5.3600   5.392292  5.379375  0.012916667  6.847917
## [158,]   3.35     5.2025   4.640000  4.865000 -0.225000000  5.392292
## [159,]   3.44     3.4875   1.044792  2.021875 -0.977083333  4.640000
## [160,]   4.15     3.5075   2.037708  2.625625 -0.587916667  1.044792
## [161,]   3.09     3.5075   2.809583  3.088750 -0.279166667  2.037708
## [162,]   4.04     3.6800   3.903958  3.814375  0.089583333  2.809583
## [163,]   5.94     4.3050   5.230000  4.860000  0.370000000  3.903958
## [164,]   5.42     4.6225   5.612083  5.216250  0.395833333  5.230000
## [165,]   3.70     4.7750   5.490625  5.204375  0.286250000  5.612083
## [166,]   3.48     4.6350   4.719375  4.685625  0.033750000  5.490625
## [167,]   4.00     4.1500   3.490625  3.754375 -0.263750000  4.719375
## [168,]   5.70     4.2200   3.845000  3.995000 -0.150000000  3.490625
## [169,]   4.35     4.3825   4.441875  4.418125  0.023750000  3.845000
## [170,]   6.40     5.1125   6.189583  5.758750  0.430833333  4.441875
## [171,]   4.90     5.3375   6.294792  5.911875  0.382916667  6.189583
## [172,]   4.30     4.9875   5.041667  5.020000  0.021666667  6.294792
## [173,]   3.59     4.7975   4.362083  4.536250 -0.174166667  5.041667
## [174,]   3.86     4.1625   3.064583  3.503750 -0.439166667  4.362083
## [175,]   5.02     4.1925   3.621667  3.850000 -0.228333333  3.064583
## [176,]   3.81     4.0700   3.677292  3.834375 -0.157083333  3.621667
## [177,]   2.77     3.8650   3.519167  3.657500 -0.138333333  3.677292
## [178,]   2.40     3.5000   2.821875  3.093125 -0.271250000  3.519167
## [179,]   3.45     3.1075   2.227292  2.579375 -0.352083333  2.821875
## [180,]   6.30     3.7300   4.028958  3.909375  0.119583333  2.227292
## [181,]   3.85     4.0000   4.692708  4.415625  0.277083333  4.028958
## [182,]   3.84     4.3600   5.294375  4.920625  0.373750000  4.692708
## [183,]   6.18     5.0425   6.308125  5.801875  0.506250000  5.294375
## [184,]   4.70     4.6425   4.861250  4.773750  0.087500000  6.308125
## [185,]   2.53     4.3125   3.851042  4.035625 -0.184583333  4.861250
## [186,]   2.20     3.9025   2.948333  3.330000 -0.381666667  3.851042
## [187,]   3.21     3.1600   1.752708  2.315625 -0.562916667  2.948333
## [188,]   4.14     3.0200   2.055417  2.441250 -0.385833333  1.752708
## [189,]   4.29     3.4600   3.583958  3.534375  0.049583333  2.055417
## [190,]   3.17     3.7025   4.313958  4.069375  0.244583333  3.583958
## [191,]   3.48     3.7700   4.239792  4.051875  0.187916667  4.313958
## [192,]   2.18     3.2800   2.824792  3.006875 -0.182083333  4.239792
## [193,]   2.38     2.8025   1.825417  2.216250 -0.390833333  2.824792
## [194,]   2.64     2.6700   1.902292  2.209375 -0.307083333  1.825417
## [195,]   3.40     2.6500   2.315625  2.449375 -0.133750000  1.902292
## [196,]   3.21     2.9075   3.157500  3.057500  0.100000000  2.315625
## [197,]   2.65     2.9750   3.265625  3.149375  0.116250000  3.157500
## [198,]   3.03     3.0725   3.357917  3.243750  0.114166667  3.265625
## [199,]   5.15     3.5100   4.166250  3.903750  0.262500000  3.357917
## [200,]   3.55     3.5950   4.106458  3.901875  0.204583333  4.166250
## [201,]   3.15     3.7200   4.129375  3.965625  0.163750000  4.106458
## [202,]   3.25     3.7750   3.983333  3.900000  0.083333333  4.129375
## [203,]   3.83     3.4450   3.130417  3.256250 -0.125833333  3.983333
## [204,]   4.05     3.5700   3.474167  3.512500 -0.038333333  3.130417
## [205,]   3.20     3.5825   3.564792  3.571875 -0.007083333  3.474167
## [206,]   2.73     3.4525   3.352500  3.392500 -0.040000000  3.564792
## [207,]   4.45     3.6075   3.698125  3.661875  0.036250000  3.352500
## [208,]   6.37     4.1875   4.987500  4.667500  0.320000000  3.698125
## [209,]   4.88     4.6075   5.680417  5.251250  0.429166667  4.987500
## [210,]   7.26     5.7400   7.747292  6.944375  0.802916667  5.680417
## [211,]   4.55     5.7650   6.915000  6.455000  0.460000000  7.747292
## [212,]   2.30     4.7475   3.968333  4.280000 -0.311666667  6.915000
## [213,]   4.92     4.7575   3.932500  4.262500 -0.330000000  3.968333
## [214,]   3.91     3.9200   2.457500  3.042500 -0.585000000  3.932500
## [215,]   6.88     4.5025   4.536875  4.523125  0.013750000  2.457500
## [216,]   4.02     4.9325   5.606458  5.336875  0.269583333  4.536875
## [217,]   2.43     4.3100   4.132917  4.203750 -0.070833333  5.606458
## [218,]   2.97     4.0750   3.441667  3.695000 -0.253333333  4.132917
## [219,]   4.07     3.3725   2.039167  2.572500 -0.533333333  3.441667
## [220,]   8.31     4.4450   5.102292  4.839375  0.262916667  2.039167
## [221,]  14.21     7.3900  11.672292  9.959375  1.712916667  5.102292
## [222,]   9.35     8.9850  13.879792 11.921875  1.957916667 11.672292
## [223,]   3.04     8.7275  10.961875 10.068125  0.893750000 13.879792
## [224,]   3.66     7.5650   6.561875  6.963125 -0.401250000 10.961875
## [225,]   6.73     5.6950   2.281458  3.646875 -1.365416667  6.561875
## [226,]  11.66     6.2725   4.951667  5.480000 -0.528333333  2.281458
## [227,]   4.63     6.6700   6.868958  6.789375  0.079583333  4.951667
## [228,]   4.59     6.9025   7.765000  7.420000  0.345000000  6.868958
## [229,]   4.80     6.4200   6.176250  6.273750 -0.097500000  7.765000
## [230,]  10.56     6.1450   5.496042  5.755625 -0.259583333  6.176250
## [231,]   8.08     7.0075   7.655417  7.396250  0.259166667  5.496042
## [232,]   6.91     7.5875   8.916667  8.385000  0.531666667  7.655417
## [233,]   6.12     7.9175   9.172708  8.670625  0.502083333  8.916667
## [234,]   6.40     6.8775   6.094167  6.407500 -0.313333333  9.172708
## [235,]   3.09     5.6300   3.341458  4.256875 -0.915416667  6.094167
## [236,]   4.66     5.0675   2.891458  3.761875 -0.870416667  3.341458
## [237,]   5.07     4.8050   3.488333  4.015000 -0.526666667  2.891458
## [238,]   3.00     3.9550   2.439375  3.045625 -0.606250000  3.488333
## [239,]   3.84     4.1425   3.559167  3.792500 -0.233333333  2.439375
## [240,]   2.26     3.5425   2.594583  2.973750 -0.379166667  3.559167
## [241,]   4.73     3.4575   2.929375  3.140625 -0.211250000  2.594583
## [242,]   3.61     3.6100   3.479792  3.531875 -0.052083333  2.929375
## [243,]   4.03     3.6575   3.808542  3.748125  0.060416667  3.479792
## [244,]   3.64     4.0025   4.536875  4.323125  0.213750000  3.808542
## [245,]   5.20     4.1200   4.574167  4.392500  0.181666667  4.536875
## [246,]   4.20     4.2675   4.693542  4.523125  0.170416667  4.574167
## [247,]   4.57     4.4025   4.743125  4.606875  0.136250000  4.693542
## [248,]   5.80     4.9425   5.791458  5.451875  0.339583333  4.743125
## [249,]   6.17     5.1850   5.994375  5.670625  0.323750000  5.791458
## [250,]   5.55     5.5225   6.371458  6.031875  0.339583333  5.994375
## [251,]   3.96     5.3700   5.561667  5.485000  0.076666667  6.371458
## [252,]   3.16     4.7100   3.898542  4.223125 -0.324583333  5.561667
## [253,]   4.20     4.2175   2.988333  3.480000 -0.491666667  3.898542
## [254,]   3.14     3.6150   2.176458  2.751875 -0.575416667  2.988333
## [255,]   3.37     3.4675   2.575833  2.932500 -0.356666667  2.176458
## [256,]   4.16     3.7175   3.656042  3.680625 -0.024583333  2.575833
## [257,]   2.80     3.3675   3.076875  3.193125 -0.116250000  3.656042
## [258,]   4.00     3.5825   3.663750  3.631250  0.032500000  3.076875
## [259,]   5.57     4.1325   4.853333  4.565000  0.288333333  3.663750
## [260,]   4.17     4.1350   4.686042  4.465625  0.220416667  4.853333
## [261,]   8.38     5.5300   7.505000  6.715000  0.790000000  4.686042
## [262,]   7.58     6.4250   8.707292  7.794375  0.912916667  7.505000
## [263,]   5.04     6.2925   7.453958  6.989375  0.464583333  8.707292
## [264,]   3.41     6.1025   6.127500  6.117500  0.010000000  7.453958
## [265,]   5.58     5.4025   4.313958  4.749375 -0.435416667  6.127500
## [266,]   4.63     4.6650   3.080625  3.714375 -0.633750000  4.313958
## [267,]   3.17     4.1975   2.706875  3.303125 -0.596250000  3.080625
## [268,]   6.35     4.9325   5.154375  5.065625  0.088750000  2.706875
## [269,]   4.64     4.6975   4.821458  4.771875  0.049583333  5.154375
## [270,]   5.91     5.0175   5.527917  5.323750  0.204166667  4.821458
## [271,]   5.02     5.4800   6.226875  5.928125  0.298750000  5.527917
## [272,]   3.68     4.8125   4.496875  4.623125 -0.126250000  6.226875
## [273,]   2.73     4.3350   3.374583  3.758750 -0.384166667  4.496875
## [274,]  12.77     6.0500   7.517708  6.930625  0.587083333  3.374583
## [275,]   4.93     6.0275   7.229583  6.748750  0.480833333  7.517708
## [276,]   4.16     6.1475   6.993333  6.655000  0.338333333  7.229583
## [277,]   5.91     6.9425   8.026875  7.593125  0.433750000  6.993333
## [278,]   4.65     4.9125   3.087500  3.817500 -0.730000000  8.026875
## [279,]   5.62     5.0850   3.940208  4.398125 -0.457916667  3.087500
## [280,]   4.16     5.0850   4.382917  4.663750 -0.280833333  3.940208
## [281,]   4.96     4.8475   4.622500  4.712500 -0.090000000  4.382917
## [282,]   5.21     4.9875   4.964583  4.973750 -0.009166667  4.622500
## [283,]   4.40     4.6825   4.318958  4.464375 -0.145416667  4.964583
## [284,]   5.83     5.1000   5.426042  5.295625  0.130416667  4.318958
## [285,]   5.84     5.3200   5.815833  5.617500  0.198333333  5.426042
## [286,]   5.33     5.3500   5.744792  5.586875  0.157916667  5.815833
## [287,]   3.55     5.1375   4.988542  5.048125 -0.059583333  5.744792
## [288,]   2.42     4.2850   3.054792  3.546875 -0.492083333  4.988542
## [289,]   4.98     4.0700   3.002292  3.429375 -0.427083333  3.054792
## [290,]   8.10     4.7625   5.093750  4.961250  0.132500000  3.002292
## [291,]   6.47     5.4925   6.892500  6.332500  0.560000000  5.093750
## [292,]   4.53     6.0200   7.576250  6.953750  0.622500000  6.892500
## [293,]     NA         NA         NA        NA           NA  7.576250
## [294,]     NA         NA         NA        NA           NA  8.198750
## [295,]     NA         NA         NA        NA           NA  8.821250
## [296,]     NA         NA         NA        NA           NA  9.443750
## [297,]     NA         NA         NA        NA           NA 10.066250
## [298,]     NA         NA         NA        NA           NA 10.688750
## [299,]     NA         NA         NA        NA           NA 11.311250
## [300,]     NA         NA         NA        NA           NA 11.933750
## [301,]     NA         NA         NA        NA           NA 12.556250
## [302,]     NA         NA         NA        NA           NA 13.178750
## [303,]     NA         NA         NA        NA           NA 13.801250
## [304,]     NA         NA         NA        NA           NA 14.423750
## [305,]     NA         NA         NA        NA           NA 15.046250
## [306,]     NA         NA         NA        NA           NA 15.668750
## [307,]     NA         NA         NA        NA           NA 16.291250
## [308,]     NA         NA         NA        NA           NA 16.913750
## [309,]     NA         NA         NA        NA           NA 17.536250
## [310,]     NA         NA         NA        NA           NA 18.158750
## [311,]     NA         NA         NA        NA           NA 18.781250
## [312,]     NA         NA         NA        NA           NA 19.403750
## [313,]     NA         NA         NA        NA           NA 20.026250
## [314,]     NA         NA         NA        NA           NA 20.648750
## [315,]     NA         NA         NA        NA           NA 21.271250
## [316,]     NA         NA         NA        NA           NA 21.893750

The DMA smoothing results are visualized as follows.

ts.plot(data1.ts, xlab="Time Period", ylab="Wind Speed", main="DMA N=4 Wind Speed")
points(data1.ts)
lines(data.gab2[,3],col="green",lwd=2)
lines(data.gab2[,6],col="red",lwd=2)
legend("topleft",c("actual data","smoothed data","forecast data"), lty=1, col=c("black","green","red"), cex=0.8)

Next, accuracy is computed on both the training and the test data, using the SSE, MSE, and MAPE accuracy measures.
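Before applying them to the course data, the three measures can be sketched on a toy example (hypothetical actual and forecast values, not the wind-speed data):

```r
# Minimal sketch of SSE, MSE, and MAPE on hypothetical values
actual <- c(5.2, 4.8, 6.1, 5.5)      # hypothetical actual observations
pred   <- c(5.0, 5.0, 5.8, 5.9)      # hypothetical forecasts
e <- actual - pred                   # forecast errors
SSE  <- sum(e^2)                     # sum of squared errors
MSE  <- mean(e^2)                    # mean squared error
MAPE <- mean(abs(e / actual)) * 100  # mean absolute percentage error
c(SSE = SSE, MSE = MSE, MAPE = MAPE)
```

The same pattern, applied to the training and test errors, produces the accuracy tables below.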

#Compute accuracy on the training data
error_train.dma = train_ma.ts-data.ramal2[1:length(train_ma.ts)]
SSE_train.dma = sum(error_train.dma[8:length(train_ma.ts)]^2)
MSE_train.dma = mean(error_train.dma[8:length(train_ma.ts)]^2)
MAPE_train.dma = mean(abs((error_train.dma[8:length(train_ma.ts)]/train_ma.ts[8:length(train_ma.ts)])*100))

akurasi_train.dma <- matrix(c(SSE_train.dma, MSE_train.dma, MAPE_train.dma))
row.names(akurasi_train.dma)<- c("SSE", "MSE", "MAPE")
colnames(akurasi_train.dma) <- c("Akurasi m = 4")
akurasi_train.dma
##      Akurasi m = 4
## SSE    2226.294858
## MSE       7.811561
## MAPE     42.635479

The training-set MAPE exceeds 10%, so by the usual MAPE guideline the fit is categorized as poor. Next, accuracy is computed on the test data.

#Compute accuracy on the test data
error_test.dma = test_ma.ts-data.gab2[97:120,6]
## Warning in `-.default`(test_ma.ts, data.gab2[97:120, 6]): longer object length
## is not a multiple of shorter object length
SSE_test.dma = sum(error_test.dma^2)
MSE_test.dma = mean(error_test.dma^2)
MAPE_test.dma = mean(abs((error_test.dma/test_ma.ts*100)))

akurasi_test.dma <- matrix(c(SSE_test.dma, MSE_test.dma, MAPE_test.dma))
row.names(akurasi_test.dma)<- c("SSE", "MSE", "MAPE")
colnames(akurasi_test.dma) <- c("Akurasi m = 4")
akurasi_test.dma
##      Akurasi m = 4
## SSE     424.569339
## MSE       5.816018
## MAPE     41.562018

Accuracy on the test data likewise yields a MAPE above 10%, so it too is categorized as poor.

On both the training and the test data, the SMA method yields better accuracy values than the DMA method.

Single Exponential Smoothing & Double Exponential Smoothing

Exponential smoothing is a smoothing method that applies exponentially decreasing weights: more recent observations receive larger weights than older ones. One or more smoothing parameters must be specified explicitly, and their chosen values determine the weights given to the observations. There are two variants: single and double exponential smoothing.
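The "exponentially decreasing weights" statement can be made concrete: expanding the smoothing recursion shows that the observation k periods in the past receives weight \(\lambda(1-\lambda)^k\). A minimal sketch with two hypothetical \(\lambda\) values:

```r
# Weight given to the observation k periods in the past: lambda * (1 - lambda)^k
ses_weights <- function(lambda, k = 0:5) lambda * (1 - lambda)^k
round(ses_weights(0.2), 4)  # decays slowly: older values keep influence
round(ses_weights(0.7), 4)  # decays quickly: recent values dominate
```

With a large \(\lambda\) the smoother tracks recent values closely; with a small \(\lambda\) older observations keep noticeable influence, giving a smoother series.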

Data Splitting

The data are split into 80% training data and 20% test data.

#split into training and test sets
training<-data[1:292,]
testing<-data[293:365,]
train.ts <- ts(training$WS50M)
test.ts <- ts(testing$WS50M)

Exploration

Exploration is done by plotting the time series for the full data, the training data, and the test data.

#data exploration
plot(data1.ts, col="black",main="Plot of all data")
points(data1.ts)

plot(train.ts, col="red",main="Plot of training data")
points(train.ts)

plot(test.ts, col="blue",main="Plot of test data")
points(test.ts)

SES

Single Exponential Smoothing (SES) is a smoothing method suited to data with a stationary (constant) pattern.

The smoothed value at period T is obtained from:

\[ \tilde{y}_T=\lambda y_T+(1-\lambda)\tilde{y}_{T-1} \]

The smoothing parameter \(\lambda\) takes a value between 0 and 1.

The smoothed value at period T then serves as the forecast for period \((T+\tau)\):

\[ \tilde{y}_{T+\tau}(T)=\tilde{y}_T \]
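The recursion and the flat forecast can be reproduced by hand; a minimal sketch on a short hypothetical series, initializing the smoother at the first observation (one common choice):

```r
# Manual SES recursion: s[t] = lambda*y[t] + (1 - lambda)*s[t-1]
lambda <- 0.2
y <- c(5.58, 4.59, 3.05, 2.95)   # short hypothetical series
s <- numeric(length(y))
s[1] <- y[1]                     # initialize with the first observation
for (t in 2:length(y))
  s[t] <- lambda * y[t] + (1 - lambda) * s[t - 1]
s[length(s)]  # the last smoothed value is the forecast for every future period
```

This flat forecast is exactly why the Point Forecast column below repeats a single value for all 73 horizons.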

SES smoothing can be performed with two functions from different packages: (1) the ses() function from the forecast package and (2) the HoltWinters() function from the stats package.

#Method 1 (the ses function)
ses.1 <- ses(train.ts, h = 73, alpha = 0.2)
plot(ses.1)

ses.1
##     Point Forecast        Lo 80     Hi 80       Lo 95     Hi 95
## 293       5.310594  2.500011524  8.121177  1.01217896  9.609009
## 294       5.310594  2.444351019  8.176837  0.92705357  9.694135
## 295       5.310594  2.389751006  8.231437  0.84355006  9.777638
## 296       5.310594  2.336153084  8.285035  0.76157911  9.859609
## 297       5.310594  2.283504022  8.337684  0.68105932  9.940129
## 298       5.310594  2.231755144  8.389433  0.60191624 10.019272
## 299       5.310594  2.180861795  8.440326  0.52408158 10.097107
## 300       5.310594  2.130782898  8.490405  0.44749251 10.173696
## 301       5.310594  2.081480559  8.539708  0.37209109 10.249097
## 302       5.310594  2.032919735  8.588269  0.29782372 10.323365
## 303       5.310594  1.985067943  8.636120  0.22464072 10.396548
## 304       5.310594  1.937895002  8.683293  0.15249593 10.468692
## 305       5.310594  1.891372817  8.729815  0.08134639 10.539842
## 306       5.310594  1.845475175  8.775713  0.01115200 10.610036
## 307       5.310594  1.800177580  8.821011 -0.05812469 10.679313
## 308       5.310594  1.755457095  8.865731 -0.12651877 10.747707
## 309       5.310594  1.711292208  8.909896 -0.19406314 10.815251
## 310       5.310594  1.667662713  8.953526 -0.26078869 10.881977
## 311       5.310594  1.624549598  8.996639 -0.32672451 10.947913
## 312       5.310594  1.581934951  9.039253 -0.39189799 11.013086
## 313       5.310594  1.539801871  9.081386 -0.45633497 11.077523
## 314       5.310594  1.498134394  9.123054 -0.52005988 11.141248
## 315       5.310594  1.456917415  9.164271 -0.58309581 11.204284
## 316       5.310594  1.416136632  9.205052 -0.64546464 11.266653
## 317       5.310594  1.375778481  9.245410 -0.70718710 11.328375
## 318       5.310594  1.335830090  9.285358 -0.76828289 11.389471
## 319       5.310594  1.296279225  9.324909 -0.82877072 11.449959
## 320       5.310594  1.257114250  9.364074 -0.88866838 11.509857
## 321       5.310594  1.218324085  9.402864 -0.94799282 11.569181
## 322       5.310594  1.179898171  9.441290 -1.00676018 11.627948
## 323       5.310594  1.141826435  9.479362 -1.06498588 11.686174
## 324       5.310594  1.104099262  9.517089 -1.12268461 11.743873
## 325       5.310594  1.066707461  9.554481 -1.17987044 11.801059
## 326       5.310594  1.029642245  9.591546 -1.23655679 11.857745
## 327       5.310594  0.992895203  9.628293 -1.29275654 11.913945
## 328       5.310594  0.956458279  9.664730 -1.34848201 11.969670
## 329       5.310594  0.920323752  9.700865 -1.40374500 12.024933
## 330       5.310594  0.884484215  9.736704 -1.45855684 12.079745
## 331       5.310594  0.848932561  9.772256 -1.51292841 12.134117
## 332       5.310594  0.813661960  9.807526 -1.56687013 12.188058
## 333       5.310594  0.778665851  9.842522 -1.62039206 12.241580
## 334       5.310594  0.743937923  9.877250 -1.67350385 12.294692
## 335       5.310594  0.709472104  9.911716 -1.72621477 12.347403
## 336       5.310594  0.675262548  9.945926 -1.77853377 12.399722
## 337       5.310594  0.641303621  9.979885 -1.83046947 12.451658
## 338       5.310594  0.607589894 10.013598 -1.88203016 12.503218
## 339       5.310594  0.574116132 10.047072 -1.93322386 12.554412
## 340       5.310594  0.540877282 10.080311 -1.98405830 12.605247
## 341       5.310594  0.507868467 10.113320 -2.03454092 12.655729
## 342       5.310594  0.475084976 10.146103 -2.08467895 12.705867
## 343       5.310594  0.442522257 10.178666 -2.13447933 12.755668
## 344       5.310594  0.410175908 10.211012 -2.18394880 12.805137
## 345       5.310594  0.378041674 10.243147 -2.23309387 12.854282
## 346       5.310594  0.346115435 10.275073 -2.28192084 12.903109
## 347       5.310594  0.314393204 10.306795 -2.33043580 12.951624
## 348       5.310594  0.282871119 10.338317 -2.37864467 12.999833
## 349       5.310594  0.251545438 10.369643 -2.42655317 13.047741
## 350       5.310594  0.220412537 10.400776 -2.47416683 13.095355
## 351       5.310594  0.189468899 10.431719 -2.52149104 13.142679
## 352       5.310594  0.158711114 10.462477 -2.56853102 13.189719
## 353       5.310594  0.128135872 10.493052 -2.61529181 13.236480
## 354       5.310594  0.097739962 10.523448 -2.66177835 13.282967
## 355       5.310594  0.067520264 10.553668 -2.70799538 13.329184
## 356       5.310594  0.037473749 10.583715 -2.75394756 13.375136
## 357       5.310594  0.007597474 10.613591 -2.79963938 13.420828
## 358       5.310594 -0.022111423 10.643300 -2.84507522 13.466263
## 359       5.310594 -0.051655724 10.672844 -2.89025933 13.511448
## 360       5.310594 -0.081038135 10.702226 -2.93519585 13.556384
## 361       5.310594 -0.110261288 10.731450 -2.97988881 13.601077
## 362       5.310594 -0.139327746 10.760516 -3.02434212 13.645530
## 363       5.310594 -0.168240001 10.789428 -3.06855959 13.689748
## 364       5.310594 -0.197000483 10.818189 -3.11254496 13.733733
## 365       5.310594 -0.225611556 10.846800 -3.15630182 13.777490
ses.2<- ses(train.ts, h = 73, alpha = 0.7)
plot(ses.2)

ses.2
##     Point Forecast       Lo 80     Hi 80       Lo 95    Hi 95
## 293       5.157854   1.9808702  8.334839   0.2990764 10.01663
## 294       5.157854   1.2798510  9.035858  -0.7730400 11.08875
## 295       5.157854   0.6874414  9.628267  -1.6790523 11.99476
## 296       5.157854   0.1648318 10.150877  -2.4783146 12.79402
## 297       5.157854  -0.3080358 10.623745  -3.2015033 13.51721
## 298       5.157854  -0.7431318 11.058841  -3.8669252 14.18263
## 299       5.157854  -1.1482791 11.463988  -4.4865445 14.80225
## 300       5.157854  -1.5289237 11.844632  -5.0686902 15.38440
## 301       5.157854  -1.8890374 12.204746  -5.6194365 15.93515
## 302       5.157854  -2.2316223 12.547331  -6.1433749 16.45908
## 303       5.157854  -2.5590133 12.874722  -6.6440763 16.95979
## 304       5.157854  -2.8730689 13.188778  -7.1243829 17.44009
## 305       5.157854  -3.1752969 13.491006  -7.5866008 17.90231
## 306       5.157854  -3.4669409 13.782650  -8.0326317 18.34834
## 307       5.157854  -3.7490404 14.064749  -8.4640657 18.77977
## 308       5.157854  -4.0224755 14.338184  -8.8822486 19.19796
## 309       5.157854  -4.2879986 14.603707  -9.2883311 19.60404
## 310       5.157854  -4.5462592 14.861968  -9.6833066 19.99902
## 311       5.157854  -4.7978225 15.113531 -10.0680395 20.38375
## 312       5.157854  -5.0431840 15.358893 -10.4432875 20.75900
## 313       5.157854  -5.2827809 15.598490 -10.8097194 21.12543
## 314       5.157854  -5.5170015 15.832710 -11.1679288 21.48364
## 315       5.157854  -5.7461921 16.061901 -11.5184456 21.83415
## 316       5.157854  -5.9706636 16.286372 -11.8617451 22.17745
## 317       5.157854  -6.1906959 16.506405 -12.1982555 22.51396
## 318       5.157854  -6.4065425 16.722251 -12.5283644 22.84407
## 319       5.157854  -6.6184335 16.934142 -12.8524238 23.16813
## 320       5.157854  -6.8265788 17.142288 -13.1707545 23.48646
## 321       5.157854  -7.0311702 17.346879 -13.4836501 23.79936
## 322       5.157854  -7.2323838 17.548093 -13.7913797 24.10709
## 323       5.157854  -7.4303816 17.746090 -14.0941912 24.40990
## 324       5.157854  -7.6253129 17.941022 -14.3923129 24.70802
## 325       5.157854  -7.8173161 18.133025 -14.6859564 25.00167
## 326       5.157854  -8.0065191 18.322228 -14.9753175 25.29103
## 327       5.157854  -8.1930412 18.508750 -15.2605784 25.57629
## 328       5.157854  -8.3769930 18.692702 -15.5419084 25.85762
## 329       5.157854  -8.5584781 18.874187 -15.8194659 26.13517
## 330       5.157854  -8.7375930 19.053302 -16.0933985 26.40911
## 331       5.157854  -8.9144283 19.230137 -16.3638448 26.67955
## 332       5.157854  -9.0890689 19.404778 -16.6309344 26.94664
## 333       5.157854  -9.2615944 19.577303 -16.8947895 27.21050
## 334       5.157854  -9.4320800 19.747789 -17.1555247 27.47123
## 335       5.157854  -9.6005963 19.916305 -17.4132482 27.72896
## 336       5.157854  -9.7672101 20.082919 -17.6680619 27.98377
## 337       5.157854  -9.9319843 20.247693 -17.9200623 28.23577
## 338       5.157854 -10.0949786 20.410687 -18.1693406 28.48505
## 339       5.157854 -10.2562494 20.571958 -18.4159830 28.73169
## 340       5.157854 -10.4158503 20.731559 -18.6600715 28.97578
## 341       5.157854 -10.5738321 20.889541 -18.9016838 29.21739
## 342       5.157854 -10.7302431 21.045952 -19.1408938 29.45660
## 343       5.157854 -10.8851292 21.200838 -19.3777717 29.69348
## 344       5.157854 -11.0385342 21.354243 -19.6123845 29.92809
## 345       5.157854 -11.1904998 21.506209 -19.8447958 30.16050
## 346       5.157854 -11.3410658 21.656775 -20.0750666 30.39078
## 347       5.157854 -11.4902701 21.805979 -20.3032549 30.61896
## 348       5.157854 -11.6381491 21.953858 -20.5294162 30.84512
## 349       5.157854 -11.7847373 22.100446 -20.7536036 31.06931
## 350       5.157854 -11.9300681 22.245777 -20.9758679 31.29158
## 351       5.157854 -12.0741733 22.389882 -21.1962577 31.51197
## 352       5.157854 -12.2170833 22.532792 -21.4148197 31.73053
## 353       5.157854 -12.3588274 22.674536 -21.6315986 31.94731
## 354       5.157854 -12.4994337 22.815142 -21.8466374 32.16235
## 355       5.157854 -12.6389292 22.954638 -22.0599773 32.37569
## 356       5.157854 -12.7773397 23.093048 -22.2716579 32.58737
## 357       5.157854 -12.9146902 23.230399 -22.4817174 32.79743
## 358       5.157854 -13.0510047 23.366713 -22.6901925 33.00590
## 359       5.157854 -13.1863063 23.502015 -22.8971184 33.21283
## 360       5.157854 -13.3206173 23.636326 -23.1025293 33.41824
## 361       5.157854 -13.4539590 23.769668 -23.3064578 33.62217
## 362       5.157854 -13.5863521 23.902061 -23.5089357 33.82464
## 363       5.157854 -13.7178167 24.033525 -23.7099934 34.02570
## 364       5.157854 -13.8483720 24.164081 -23.9096605 34.22537
## 365       5.157854 -13.9780366 24.293745 -24.1079654 34.42367

To plot the smoothing results on the training data produced by ses(), use the autoplot() and autolayer() functions (generics from ggplot2, with methods supplied by the forecast package).

autoplot(ses.1) +
  autolayer(fitted(ses.1), series="Fitted")

The ses() function has several commonly used arguments: y , gamma , beta , alpha , and h .

Here y is the time-series data, gamma is the smoothing parameter for the seasonal component, beta is the smoothing parameter for the trend, alpha is the smoothing parameter for the level (the stationary component), and h is the number of periods to forecast.

The example above initializes the parameter \(\lambda\) with alpha values of 0.2 and 0.7 and forecasts 73 periods ahead. Next, the HoltWinters() function is used with the same parameter initialization and the same forecast horizon as ses() .

#Method 2 (the HoltWinters function)
ses1<- HoltWinters(train.ts, gamma = FALSE, beta = FALSE, alpha = 0.2)
plot(ses1)

#forecast
ramalan1<- forecast(ses1, h=73)
ramalan1
##     Point Forecast        Lo 80     Hi 80        Lo 95     Hi 95
## 293       5.310594  2.497525359  8.123663  1.008376704  9.612812
## 294       5.310594  2.441815619  8.179373  0.923176009  9.698012
## 295       5.310594  2.387167308  8.234021  0.839598630  9.781590
## 296       5.310594  2.333521974  8.287666  0.757555172  9.863633
## 297       5.310594  2.280826341  8.340362  0.676964155  9.944224
## 298       5.310594  2.229031687  8.392157  0.597751067 10.023437
## 299       5.310594  2.178093319  8.443095  0.519847557 10.101341
## 300       5.310594  2.127970124  8.493218  0.443190745 10.177998
## 301       5.310594  2.078624173  8.542564  0.367722626 10.253466
## 302       5.310594  2.030020394  8.591168  0.293389560 10.327799
## 303       5.310594  1.982126273  8.639062  0.220141823 10.401046
## 304       5.310594  1.934911605  8.686277  0.147933219 10.473255
## 305       5.310594  1.888348267  8.732840  0.076720739 10.544468
## 306       5.310594  1.842410025  8.778778  0.006464261 10.614724
## 307       5.310594  1.797072361  8.824116 -0.062873713 10.684062
## 308       5.310594  1.752312318  8.868876 -0.131328292 10.752517
## 309       5.310594  1.708108364  8.913080 -0.198932405 10.820121
## 310       5.310594  1.664440276  8.956748 -0.265716983 10.886905
## 311       5.310594  1.621289024  8.999899 -0.331711128 10.952899
## 312       5.310594  1.578636681  9.042552 -0.396942257 11.018131
## 313       5.310594  1.536466332  9.084722 -0.461436241 11.082625
## 314       5.310594  1.494761996  9.126426 -0.525217518 11.146406
## 315       5.310594  1.453508558  9.167680 -0.588309208 11.209497
## 316       5.310594  1.412691701  9.208497 -0.650733203 11.271921
## 317       5.310594  1.372297851  9.248890 -0.712510266 11.333699
## 318       5.310594  1.332314122  9.288874 -0.773660102 11.394848
## 319       5.310594  1.292728271  9.328460 -0.834201436 11.455390
## 320       5.310594  1.253528652  9.367660 -0.894152081 11.515340
## 321       5.310594  1.214704174  9.406484 -0.953528996 11.574717
## 322       5.310594  1.176244270  9.444944 -1.012348344 11.633537
## 323       5.310594  1.138138857  9.483049 -1.070625543 11.691814
## 324       5.310594  1.100378311  9.520810 -1.128375314 11.749564
## 325       5.310594  1.062953435  9.558235 -1.185611724 11.806800
## 326       5.310594  1.025855432  9.595333 -1.242348224 11.863536
## 327       5.310594  0.989075884  9.632112 -1.298597687 11.919786
## 328       5.310594  0.952606729  9.668582 -1.354372447 11.975561
## 329       5.310594  0.916440239  9.704748 -1.409684321 12.030873
## 330       5.310594  0.880568999  9.740619 -1.464544647 12.085733
## 331       5.310594  0.844985897  9.776202 -1.518964307 12.140153
## 332       5.310594  0.809684096  9.811504 -1.572953751 12.194142
## 333       5.310594  0.774657031  9.846531 -1.626523025 12.247711
## 334       5.310594  0.739898384  9.881290 -1.679681789 12.300870
## 335       5.310594  0.705402077  9.915786 -1.732439336 12.353628
## 336       5.310594  0.671162260  9.950026 -1.784804619 12.405993
## 337       5.310594  0.637173294  9.984015 -1.836786257 12.457975
## 338       5.310594  0.603429745 10.017759 -1.888392561 12.509581
## 339       5.310594  0.569926373 10.051262 -1.939631548 12.560820
## 340       5.310594  0.536658120 10.084530 -1.990510948 12.611699
## 341       5.310594  0.503620107 10.117568 -2.041038230 12.662226
## 342       5.310594  0.470807617 10.150381 -2.091220603 12.712409
## 343       5.310594  0.438216093 10.182972 -2.141065037 12.762253
## 344       5.310594  0.405841132 10.215347 -2.190578267 12.811767
## 345       5.310594  0.373678473 10.247510 -2.239766810 12.860955
## 346       5.310594  0.341723993 10.279464 -2.288636970 12.909825
## 347       5.310594  0.309973701 10.311215 -2.337194851 12.958383
## 348       5.310594  0.278423732 10.342765 -2.385446365 13.006635
## 349       5.310594  0.247070342 10.374118 -2.433397237 13.054585
## 350       5.310594  0.215909902 10.405278 -2.481053018 13.102241
## 351       5.310594  0.184938891 10.436249 -2.528419090 13.149607
## 352       5.310594  0.154153899 10.467034 -2.575500674 13.196689
## 353       5.310594  0.123551611 10.497637 -2.622302835 13.243491
## 354       5.310594  0.093128813 10.528059 -2.668830489 13.290019
## 355       5.310594  0.062882384 10.558306 -2.715088410 13.336277
## 356       5.310594  0.032809291 10.588379 -2.761081237 13.382269
## 357       5.310594  0.002906588 10.618282 -2.806813474 13.428002
## 358       5.310594 -0.026828589 10.648017 -2.852289503 13.473478
## 359       5.310594 -0.056399024 10.677587 -2.897513581 13.518702
## 360       5.310594 -0.085807426 10.706996 -2.942489851 13.563678
## 361       5.310594 -0.115056429 10.736245 -2.987222341 13.608411
## 362       5.310594 -0.144148598 10.765337 -3.031714973 13.652903
## 363       5.310594 -0.173086428 10.794275 -3.075971565 13.697160
## 364       5.310594 -0.201872350 10.823061 -3.119995835 13.741184
## 365       5.310594 -0.230508733 10.851697 -3.163791402 13.784980
ses2<- HoltWinters(train.ts, gamma = FALSE, beta = FALSE, alpha = 0.7)
plot(ses2)

# forecast
ramalan2<- forecast(ses2, h=73)
ramalan2
##     Point Forecast       Lo 80     Hi 80       Lo 95    Hi 95
## 293       5.157854   1.9806909  8.335018   0.2988021 10.01691
## 294       5.157854   1.2796321  9.036077  -0.7733748 11.08908
## 295       5.157854   0.6871890  9.628520  -1.6794382 11.99515
## 296       5.157854   0.1645500 10.151159  -2.4787457 12.79445
## 297       5.157854  -0.3083444 10.624053  -3.2019752 13.51768
## 298       5.157854  -0.7434649 11.059174  -3.8674346 14.18314
## 299       5.157854  -1.1486351 11.464344  -4.4870890 14.80280
## 300       5.157854  -1.5293012 11.845010  -5.0692675 15.38498
## 301       5.157854  -1.8894352 12.205144  -5.6200449 15.93575
## 302       5.157854  -2.2320394 12.547748  -6.1440129 16.45972
## 303       5.157854  -2.5594489 12.875158  -6.6447426 16.96045
## 304       5.157854  -2.8735223 13.189231  -7.1250762 17.44078
## 305       5.157854  -3.1757674 13.491476  -7.5873202 17.90303
## 306       5.157854  -3.4674277 13.783136  -8.0333763 18.34909
## 307       5.157854  -3.7495432 14.065252  -8.4648347 18.78054
## 308       5.157854  -4.0229937 14.338702  -8.8830411 19.19875
## 309       5.157854  -4.2885318 14.604241  -9.2891466 19.60486
## 310       5.157854  -4.5468070 14.862516  -9.6841444 19.99985
## 311       5.157854  -4.7983845 15.114093 -10.0688990 20.38461
## 312       5.157854  -5.0437599 15.359469 -10.4441682 20.75988
## 313       5.157854  -5.2833703 15.599079 -10.8106208 21.12633
## 314       5.157854  -5.5176041 15.833313 -11.1688504 21.48456
## 315       5.157854  -5.7468076 16.062516 -11.5193870 21.83510
## 316       5.157854  -5.9712918 16.287001 -11.8627059 22.17841
## 317       5.157854  -6.1913365 16.507045 -12.1992353 22.51494
## 318       5.157854  -6.4071953 16.722904 -12.5293628 22.84507
## 319       5.157854  -6.6190983 16.934807 -12.8534405 23.16915
## 320       5.157854  -6.8272553 17.142964 -13.1717892 23.48750
## 321       5.157854  -7.0318583 17.347567 -13.4847024 23.80041
## 322       5.157854  -7.2330833 17.548792 -13.7924494 24.10816
## 323       5.157854  -7.4310922 17.746801 -14.0952780 24.41099
## 324       5.157854  -7.6260346 17.941743 -14.3934166 24.70913
## 325       5.157854  -7.8180485 18.133757 -14.6870766 25.00279
## 326       5.157854  -8.0072623 18.322971 -14.9764541 25.29216
## 327       5.157854  -8.1937948 18.509504 -15.2617310 25.57744
## 328       5.157854  -8.3777571 18.693466 -15.5430769 25.85879
## 329       5.157854  -8.5592524 18.874961 -15.8206501 26.13636
## 330       5.157854  -8.7383774 19.054086 -16.0945982 26.41031
## 331       5.157854  -8.9152227 19.230931 -16.3650597 26.68077
## 332       5.157854  -9.0898731 19.405582 -16.6321644 26.94787
## 333       5.157854  -9.2624084 19.578117 -16.8960344 27.21174
## 334       5.157854  -9.4329036 19.748612 -17.1567843 27.47249
## 335       5.157854  -9.6014295 19.917138 -17.4145224 27.73023
## 336       5.157854  -9.7680526 20.083761 -17.6693505 27.98506
## 337       5.157854  -9.9328361 20.248545 -17.9213651 28.23707
## 338       5.157854 -10.0958396 20.411548 -18.1706574 28.48637
## 339       5.157854 -10.2571195 20.572828 -18.4173138 28.73302
## 340       5.157854 -10.4167294 20.732438 -18.6614161 28.97712
## 341       5.157854 -10.5747201 20.890429 -18.9030420 29.21875
## 342       5.157854 -10.7311400 21.046849 -19.1422655 29.45797
## 343       5.157854 -10.8860348 21.201744 -19.3791568 29.69487
## 344       5.157854 -11.0394485 21.355157 -19.6137827 29.92949
## 345       5.157854 -11.1914227 21.507131 -19.8462072 30.16192
## 346       5.157854 -11.3419972 21.657706 -20.0764910 30.39220
## 347       5.157854 -11.4912099 21.806919 -20.3046922 30.62040
## 348       5.157854 -11.6390972 21.954806 -20.5308663 30.84657
## 349       5.157854 -11.7856937 22.101402 -20.7550663 31.07078
## 350       5.157854 -11.9310327 22.246741 -20.9773431 31.29305
## 351       5.157854 -12.0751460 22.390855 -21.1977454 31.51345
## 352       5.157854 -12.2180641 22.533773 -21.4163197 31.73203
## 353       5.157854 -12.3598162 22.675525 -21.6331109 31.94882
## 354       5.157854 -12.5004305 22.816139 -21.8481618 32.16387
## 355       5.157854 -12.6399338 22.955643 -22.0615138 32.37722
## 356       5.157854 -12.7783522 23.094061 -22.2732063 32.58892
## 357       5.157854 -12.9157104 23.231419 -22.4832777 32.79899
## 358       5.157854 -13.0520326 23.367741 -22.6917645 33.00747
## 359       5.157854 -13.1873419 23.503051 -22.8987021 33.21441
## 360       5.157854 -13.3216604 23.637369 -23.1041246 33.41983
## 361       5.157854 -13.4550096 23.770718 -23.3080646 33.62377
## 362       5.157854 -13.5874103 23.903119 -23.5105539 33.82626
## 363       5.157854 -13.7188823 24.034591 -23.7116230 34.02733
## 364       5.157854 -13.8494449 24.165154 -23.9113014 34.22701
## 365       5.157854 -13.9791168 24.294826 -24.1096174 34.42533

The HoltWinters() function accepts smoothing arguments analogous to those of ses() . The full argument lists of both functions can be inspected with ?ses or ?HoltWinters .

The \(\alpha\) parameter of both functions can be optimized so that the resulting error is minimized. This is done by setting \(\alpha =\) NULL .
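Under the hood, setting alpha to NULL makes the function choose the value that minimizes the in-sample one-step-ahead SSE. The sketch below illustrates that idea with a coarse grid search; `y` is a short placeholder series, not train.ts, so the resulting alpha is only illustrative.

```r
# Illustration only: grid search over alpha, mimicking what
# HoltWinters(..., alpha = NULL) does internally (minimizing in-sample SSE).
y <- c(5.58, 4.59, 3.05, 2.95, 3.56, 3.94, 9.41, 5.88, 6.23, 3.70)

sse_ses <- function(y, alpha) {
  level <- y[1]                      # initialize the level at the first observation
  sse <- 0
  for (t in 2:length(y)) {
    sse <- sse + (y[t] - level)^2    # one-step-ahead error before updating
    level <- alpha * y[t] + (1 - alpha) * level
  }
  sse
}

alphas <- seq(0.01, 0.99, by = 0.01)
sse_values <- sapply(alphas, function(a) sse_ses(y, a))
alphas[which.min(sse_values)]        # alpha with the smallest SSE
```

The built-in optimizer uses numerical optimization rather than a grid, but the objective (one-step SSE on the training data) is the same.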

#SES
ses.opt <- ses(train.ts, h = 73, alpha = NULL)
plot(ses.opt)

ses.opt
##     Point Forecast    Lo 80    Hi 80      Lo 95     Hi 95
## 293       5.182554 2.417079 7.948029 0.95312497  9.411983
## 294       5.182554 2.408163 7.956945 0.93948901  9.425618
## 295       5.182554 2.399275 7.965832 0.92589674  9.439211
## 296       5.182554 2.390416 7.974691 0.91234773  9.452760
## 297       5.182554 2.381585 7.983523 0.89884157  9.466266
## 298       5.182554 2.372781 7.992326 0.88537787  9.479730
## 299       5.182554 2.364006 8.001102 0.87195621  9.493151
## 300       5.182554 2.355257 8.009851 0.85857622  9.506531
## 301       5.182554 2.346535 8.018572 0.84523750  9.519870
## 302       5.182554 2.337840 8.027267 0.83193968  9.533168
## 303       5.182554 2.329172 8.035936 0.81868238  9.546425
## 304       5.182554 2.320529 8.044578 0.80546523  9.559642
## 305       5.182554 2.311913 8.053194 0.79228787  9.572820
## 306       5.182554 2.303323 8.061785 0.77914995  9.585958
## 307       5.182554 2.294758 8.070350 0.76605111  9.599056
## 308       5.182554 2.286218 8.078889 0.75299100  9.612116
## 309       5.182554 2.277704 8.087404 0.73996929  9.625138
## 310       5.182554 2.269214 8.095893 0.72698563  9.638122
## 311       5.182554 2.260749 8.104358 0.71403970  9.651068
## 312       5.182554 2.252309 8.112798 0.70113117  9.663976
## 313       5.182554 2.243893 8.121215 0.68825971  9.676848
## 314       5.182554 2.235501 8.129607 0.67542501  9.689682
## 315       5.182554 2.227132 8.137975 0.66262675  9.702481
## 316       5.182554 2.218788 8.146320 0.64986464  9.715243
## 317       5.182554 2.210466 8.154641 0.63713835  9.727969
## 318       5.182554 2.202168 8.162939 0.62444760  9.740660
## 319       5.182554 2.193893 8.171214 0.61179208  9.753315
## 320       5.182554 2.185641 8.179466 0.59917150  9.765936
## 321       5.182554 2.177412 8.187696 0.58658558  9.778522
## 322       5.182554 2.169205 8.195903 0.57403404  9.791073
## 323       5.182554 2.161020 8.204088 0.56151658  9.803591
## 324       5.182554 2.152857 8.212250 0.54903294  9.816075
## 325       5.182554 2.144717 8.220391 0.53658285  9.828525
## 326       5.182554 2.136598 8.228510 0.52416603  9.840941
## 327       5.182554 2.128500 8.236607 0.51178222  9.853325
## 328       5.182554 2.120424 8.244683 0.49943115  9.865676
## 329       5.182554 2.112370 8.252738 0.48711257  9.877995
## 330       5.182554 2.104336 8.260771 0.47482623  9.890281
## 331       5.182554 2.096323 8.268784 0.46257187  9.902536
## 332       5.182554 2.088332 8.276776 0.45034924  9.914758
## 333       5.182554 2.080360 8.284747 0.43815810  9.926949
## 334       5.182554 2.072409 8.292698 0.42599821  9.939109
## 335       5.182554 2.064479 8.300629 0.41386932  9.951238
## 336       5.182554 2.056568 8.308539 0.40177121  9.963336
## 337       5.182554 2.048677 8.316430 0.38970363  9.975404
## 338       5.182554 2.040807 8.324301 0.37766636  9.987441
## 339       5.182554 2.032956 8.332152 0.36565917  9.999448
## 340       5.182554 2.025124 8.339983 0.35368184 10.011426
## 341       5.182554 2.017312 8.347796 0.34173414 10.023373
## 342       5.182554 2.009519 8.355588 0.32981585 10.035292
## 343       5.182554 2.001745 8.363362 0.31792677 10.047181
## 344       5.182554 1.993990 8.371117 0.30606667 10.059041
## 345       5.182554 1.986254 8.378853 0.29423535 10.070872
## 346       5.182554 1.978537 8.386571 0.28243260 10.082675
## 347       5.182554 1.970838 8.394270 0.27065820 10.094449
## 348       5.182554 1.963157 8.401950 0.25891197 10.106196
## 349       5.182554 1.955495 8.409612 0.24719368 10.117914
## 350       5.182554 1.947851 8.417256 0.23550316 10.129604
## 351       5.182554 1.940225 8.424882 0.22384020 10.141267
## 352       5.182554 1.932617 8.432490 0.21220460 10.152903
## 353       5.182554 1.925027 8.440081 0.20059618 10.164511
## 354       5.182554 1.917454 8.447653 0.18901475 10.176093
## 355       5.182554 1.909899 8.455209 0.17746011 10.187647
## 356       5.182554 1.902361 8.462746 0.16593209 10.199175
## 357       5.182554 1.894841 8.470267 0.15443050 10.210677
## 358       5.182554 1.887337 8.477770 0.14295516 10.222152
## 359       5.182554 1.879851 8.485256 0.13150588 10.233602
## 360       5.182554 1.872382 8.492726 0.12008251 10.245025
## 361       5.182554 1.864929 8.500178 0.10868485 10.256423
## 362       5.182554 1.857493 8.507614 0.09731273 10.267795
## 363       5.182554 1.850074 8.515033 0.08596599 10.279141
## 364       5.182554 1.842671 8.522436 0.07464446 10.290463
## 365       5.182554 1.835285 8.529822 0.06334796 10.301760
# optimum alpha via HoltWinters
sesopt<- HoltWinters(train.ts, gamma = FALSE, beta = FALSE,alpha = NULL)
sesopt
## Holt-Winters exponential smoothing without trend and without seasonal component.
## 
## Call:
## HoltWinters(x = train.ts, alpha = NULL, beta = FALSE, gamma = FALSE)
## 
## Smoothing parameters:
##  alpha: 0.07840464
##  beta : FALSE
##  gamma: FALSE
## 
## Coefficients:
##       [,1]
## a 5.179494
plot(sesopt)

# forecast
ramalanopt<- forecast(sesopt, h=73)
ramalanopt
##     Point Forecast    Lo 80    Hi 80      Lo 95     Hi 95
## 293       5.179494 2.411685 7.947303 0.94649594  9.412492
## 294       5.179494 2.403191 7.955797 0.93350515  9.425483
## 295       5.179494 2.394723 7.964265 0.92055398  9.438434
## 296       5.179494 2.386280 7.972708 0.90764208  9.451346
## 297       5.179494 2.377863 7.981125 0.89476908  9.464219
## 298       5.179494 2.369471 7.989517 0.88193465  9.477053
## 299       5.179494 2.361104 7.997884 0.86913843  9.489849
## 300       5.179494 2.352762 8.006226 0.85638009  9.502608
## 301       5.179494 2.344444 8.014544 0.84365929  9.515329
## 302       5.179494 2.336151 8.022837 0.83097570  9.528012
## 303       5.179494 2.327881 8.031106 0.81832900  9.540659
## 304       5.179494 2.319636 8.039352 0.80571886  9.553269
## 305       5.179494 2.311415 8.047573 0.79314498  9.565843
## 306       5.179494 2.303216 8.055772 0.78060704  9.578381
## 307       5.179494 2.295042 8.063946 0.76810474  9.590883
## 308       5.179494 2.286890 8.072098 0.75563776  9.603350
## 309       5.179494 2.278761 8.080227 0.74320583  9.615782
## 310       5.179494 2.270655 8.088333 0.73080863  9.628179
## 311       5.179494 2.262571 8.096416 0.71844588  9.640542
## 312       5.179494 2.254510 8.104478 0.70611731  9.652871
## 313       5.179494 2.246471 8.112517 0.69382261  9.665165
## 314       5.179494 2.238454 8.120534 0.68156152  9.677426
## 315       5.179494 2.230459 8.128529 0.66933377  9.689654
## 316       5.179494 2.222485 8.136503 0.65713907  9.701849
## 317       5.179494 2.214533 8.144455 0.64497717  9.714011
## 318       5.179494 2.206602 8.152386 0.63284781  9.726140
## 319       5.179494 2.198692 8.160296 0.62075071  9.738237
## 320       5.179494 2.190803 8.168185 0.60868563  9.750302
## 321       5.179494 2.182935 8.176053 0.59665232  9.762336
## 322       5.179494 2.175087 8.183901 0.58465051  9.774337
## 323       5.179494 2.167260 8.191728 0.57267998  9.786308
## 324       5.179494 2.159453 8.199535 0.56074047  9.798247
## 325       5.179494 2.151667 8.207321 0.54883174  9.810156
## 326       5.179494 2.143900 8.215088 0.53695356  9.822034
## 327       5.179494 2.136153 8.222835 0.52510570  9.833882
## 328       5.179494 2.128426 8.230562 0.51328791  9.845700
## 329       5.179494 2.120718 8.238270 0.50149999  9.857488
## 330       5.179494 2.113030 8.245958 0.48974169  9.869246
## 331       5.179494 2.105361 8.253627 0.47801279  9.880975
## 332       5.179494 2.097711 8.261277 0.46631309  9.892675
## 333       5.179494 2.090080 8.268908 0.45464236  9.904346
## 334       5.179494 2.082467 8.276521 0.44300038  9.915988
## 335       5.179494 2.074874 8.284114 0.43138695  9.927601
## 336       5.179494 2.067299 8.291689 0.41980185  9.939186
## 337       5.179494 2.059742 8.299246 0.40824489  9.950743
## 338       5.179494 2.052203 8.306784 0.39671585  9.962272
## 339       5.179494 2.044683 8.314305 0.38521453  9.973773
## 340       5.179494 2.037181 8.321807 0.37374074  9.985247
## 341       5.179494 2.029696 8.329292 0.36229428  9.996694
## 342       5.179494 2.022230 8.336758 0.35087495 10.008113
## 343       5.179494 2.014781 8.344207 0.33948257 10.019505
## 344       5.179494 2.007349 8.351639 0.32811693 10.030871
## 345       5.179494 1.999935 8.359053 0.31677787 10.042210
## 346       5.179494 1.992538 8.366450 0.30546518 10.053523
## 347       5.179494 1.985158 8.373830 0.29417869 10.064809
## 348       5.179494 1.977795 8.381193 0.28291821 10.076070
## 349       5.179494 1.970449 8.388539 0.27168357 10.087304
## 350       5.179494 1.963120 8.395868 0.26047459 10.098513
## 351       5.179494 1.955808 8.403180 0.24929109 10.109697
## 352       5.179494 1.948512 8.410476 0.23813290 10.120855
## 353       5.179494 1.941232 8.417756 0.22699985 10.131988
## 354       5.179494 1.933969 8.425019 0.21589178 10.143096
## 355       5.179494 1.926722 8.432266 0.20480850 10.154179
## 356       5.179494 1.919491 8.439497 0.19374987 10.165238
## 357       5.179494 1.912276 8.446712 0.18271570 10.176272
## 358       5.179494 1.905077 8.453911 0.17170586 10.187282
## 359       5.179494 1.897894 8.461094 0.16072016 10.198268
## 360       5.179494 1.890727 8.468261 0.14975846 10.209229
## 361       5.179494 1.883575 8.475413 0.13882059 10.220167
## 362       5.179494 1.876438 8.482550 0.12790641 10.231081
## 363       5.179494 1.869317 8.489671 0.11701576 10.241972
## 364       5.179494 1.862212 8.496776 0.10614849 10.252839
## 365       5.179494 1.855121 8.503867 0.09530445 10.263683

After the forecasts are produced, their accuracy is evaluated. This accuracy calculation is carried out on both the training data and the test data.

Training Data Accuracy

Accuracy can be computed either directly or manually. Directly, the accuracy value can be taken from the object stored in the SES result, namely the sum of squared errors (SSE). Other accuracy measures can then be derived from this SSE value.

# Method accuracy
# on the training data
SSE1<-ses1$SSE
MSE1<-ses1$SSE/length(train.ts)
RMSE1<-sqrt(MSE1)

akurasi1 <- matrix(c(SSE1,MSE1,RMSE1))
row.names(akurasi1)<- c("SSE", "MSE", "RMSE")
colnames(akurasi1) <- c("Akurasi lamda=0.2")
akurasi1
##      Akurasi lamda=0.2
## SSE        1397.296155
## MSE           4.785261
## RMSE          2.187524
SSE2<-ses2$SSE
MSE2<-ses2$SSE/length(train.ts)
RMSE2<-sqrt(MSE2)

akurasi2 <- matrix(c(SSE2,MSE2,RMSE2))
row.names(akurasi2)<- c("SSE", "MSE", "RMSE")
colnames(akurasi2) <- c("Akurasi lamda=0.7")
akurasi2
##      Akurasi lamda=0.7
## SSE        1782.400354
## MSE           6.104111
## RMSE          2.470650
# manual calculation
fitted1<-ramalan1$fitted
sisaan1<-ramalan1$residuals
head(sisaan1)
## Time Series:
## Start = 1 
## End = 6 
## Frequency = 1 
## [1]        NA -0.990000 -2.332000 -1.965600 -0.962480 -0.389984
resid1<-training$WS50M-ramalan1$fitted
head(resid1)
## Time Series:
## Start = 1 
## End = 6 
## Frequency = 1 
## [1]        NA -0.990000 -2.332000 -1.965600 -0.962480 -0.389984
# manual calculation
SSE.1=sum(sisaan1[2:length(train.ts)]^2)
SSE.1
## [1] 1397.296
MSE.1 = SSE.1/length(train.ts)
MSE.1
## [1] 4.785261
MAPE.1 = sum(abs(sisaan1[2:length(train.ts)]/train.ts[2:length(train.ts)])*
               100)/length(train.ts)
MAPE.1
## [1] 32.56683
akurasi.1 <- matrix(c(SSE.1,MSE.1,MAPE.1))
row.names(akurasi.1)<- c("SSE", "MSE", "MAPE")
colnames(akurasi.1) <- c("Akurasi lamda=0.2")
akurasi.1
##      Akurasi lamda=0.2
## SSE        1397.296155
## MSE           4.785261
## MAPE         32.566832
fitted2<-ramalan2$fitted
sisaan2<-ramalan2$residuals
head(sisaan2)
## Time Series:
## Start = 1 
## End = 6 
## Frequency = 1 
## [1]        NA -0.990000 -1.837000 -0.651100  0.414670  0.504401
resid2<-training$WS50M-ramalan2$fitted
head(resid2)
## Time Series:
## Start = 1 
## End = 6 
## Frequency = 1 
## [1]        NA -0.990000 -1.837000 -0.651100  0.414670  0.504401
SSE.2=sum(sisaan2[2:length(train.ts)]^2)
SSE.2
## [1] 1782.4
MSE.2 = SSE.2/length(train.ts)
MSE.2
## [1] 6.104111
MAPE.2 = sum(abs(sisaan2[2:length(train.ts)]/train.ts[2:length(train.ts)])*
               100)/length(train.ts)
MAPE.2
## [1] 36.27731
akurasi.2 <- matrix(c(SSE.2,MSE.2,MAPE.2))
row.names(akurasi.2)<- c("SSE", "MSE", "MAPE")
colnames(akurasi.2) <- c("Akurasi lamda=0.7")
akurasi.2
##      Akurasi lamda=0.7
## SSE        1782.400354
## MSE           6.104111
## MAPE         36.277308

Based on the SSE, MSE, RMSE, and MAPE values for the two parameter settings, \(\lambda=0.2\) produces better accuracy than \(\lambda=0.7\), as every accuracy measure is smaller. However, judging by the MAPE, the forecasts can still be categorized as poor, since the MAPE exceeds 10%.
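The 10% cutoff used above is one point on a commonly cited rule of thumb for interpreting MAPE; the exact thresholds vary across textbooks, so the values below are an assumption of this sketch rather than something established earlier in the document.

```r
# Rule-of-thumb MAPE bands (threshold values are a common convention, not
# defined anywhere above in this document):
mape_category <- function(mape) {
  if (mape < 10) "highly accurate"
  else if (mape < 20) "good"
  else if (mape < 50) "reasonable"
  else "inaccurate"
}

mape_category(32.57)  # the training MAPE obtained for lambda = 0.2
```

Under these bands, a MAPE of roughly 33% falls in the "reasonable" range rather than being outright unusable, though it clearly leaves room for improvement.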

Test Data Accuracy

Test-data accuracy can be computed in much the same way as training-data accuracy.

selisih1<-ramalan1$mean-testing$WS50M
SSEtesting1<-sum(selisih1^2)
MSEtesting1<-SSEtesting1/nrow(testing) #nrow(), not length(): length() on a data frame counts columns

selisih2<-ramalan2$mean-testing$WS50M
SSEtesting2<-sum(selisih2^2)
MSEtesting2<-SSEtesting2/nrow(testing)

selisihopt<-ramalanopt$mean-testing$WS50M
SSEtestingopt<-sum(selisihopt^2)
MSEtestingopt<-SSEtestingopt/nrow(testing)

akurasitesting1 <- matrix(c(SSEtesting1,SSEtesting2,SSEtestingopt))
row.names(akurasitesting1)<- c("SSE1", "SSE2", "SSEopt")
akurasitesting1
##            [,1]
## SSE1   254.7511
## SSE2   259.0008
## SSEopt 258.1916
akurasitesting2 <- matrix(c(MSEtesting1,MSEtesting2,MSEtestingopt))
row.names(akurasitesting2)<- c("MSE1", "MSE2", "MSEopt")
akurasitesting2
##            [,1]
## MSE1   3.489741
## MSE2   3.547956
## MSEopt 3.536871

Besides the approach above, accuracy can be computed with the accuracy() function from the forecast package. It is called as accuracy(forecast result, actual values) . For example:

# alternative approach
accuracy(ramalanopt,testing$WS50M)
##                       ME     RMSE      MAE        MPE     MAPE      MASE
## Training set -0.01755393 2.156090 1.618200 -14.184431 32.82565 0.8004762
## Test set      0.24530057 1.880657 1.518528  -8.584581 32.00398 0.7511714
##                   ACF1
## Training set 0.1676135
## Test set            NA

DES

Double Exponential Smoothing (DES) is used for data with a trend pattern. DES is essentially SES applied twice: first for the 'level' stage and then for the 'trend' stage. Smoothing with this method produces non-constant forecasts for subsequent periods.
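
The two smoothing stages can be written as explicit recursions. The sketch below mirrors what HoltWinters() computes for Holt's linear method; `y`, `alpha`, and `beta` are placeholder values for illustration, not taken from the data above.

```r
# Illustrative DES (Holt's linear) recursions; HoltWinters() performs the
# equivalent computation with its own initialization and optimization.
y <- c(2.1, 2.8, 3.4, 4.1, 4.9, 5.3, 6.2, 6.8)  # placeholder trending series
alpha <- 0.2
beta  <- 0.2

level <- y[1]
trend <- y[2] - y[1]                  # simple initialization of the trend
for (t in 2:length(y)) {
  prev_level <- level
  level <- alpha * y[t] + (1 - alpha) * (prev_level + trend)  # level update
  trend <- beta * (level - prev_level) + (1 - beta) * trend   # trend update
}

h <- 1:3
level + h * trend                     # h-step-ahead forecasts: level + h * trend
```

Because the forecast is level + h * trend, the forecast path is a straight line rather than the flat line produced by SES.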

DES smoothing here again uses the HoltWinters() function. Whereas the beta argument was previously set to FALSE , it is now initialized together with the alpha value.

# alpha = 0.2 and beta = 0.2
des.1<- HoltWinters(train.ts, gamma = FALSE, beta = 0.2, alpha = 0.2)
plot(des.1)

# forecast
ramalandes1<- forecast(des.1, h=73)
ramalandes1
##     Point Forecast       Lo 80     Hi 80        Lo 95     Hi 95
## 293       5.227769   2.1864984  8.269040   0.57654678  9.878992
## 294       5.271880   2.1442464  8.399513   0.48857729 10.055182
## 295       5.315990   2.0745032  8.557477   0.35856360 10.273417
## 296       5.360101   1.9756701  8.744531   0.18406090 10.536140
## 297       5.404211   1.8470938  8.961328  -0.03593024 10.844352
## 298       5.448321   1.6889364  9.207706  -0.30116173 11.197804
## 299       5.492432   1.5019875  9.482876  -0.61042604 11.595289
## 300       5.536542   1.2874610  9.785623  -0.96186667 12.034951
## 301       5.580653   1.0468121 10.114493  -1.35325804 12.514563
## 302       5.624763   0.7815926 10.467933  -1.78222687 13.031753
## 303       5.668873   0.4933485 10.844398  -2.24640883 13.584155
## 304       5.712984   0.1835550 11.242412  -2.74354779 14.169515
## 305       5.757094  -0.1464194 11.660608  -3.27155071 14.785739
## 306       5.801205  -0.4953282 12.097737  -3.82851128 15.430920
## 307       5.845315  -0.8620495 12.552679  -4.41271367 16.103343
## 308       5.889425  -1.2455805 13.024431  -5.02262437 16.801475
## 309       5.933536  -1.6450284 13.512100  -5.65687791 17.523949
## 310       5.977646  -2.0595993 14.014892  -6.31426008 18.269552
## 311       6.021756  -2.4885870 14.532100  -6.99369072 19.037204
## 312       6.065867  -2.9313620 15.063096  -7.69420731 19.825941
## 313       6.109977  -3.3873622 15.607317  -8.41495003 20.634905
## 314       6.154088  -3.8560837 16.164259  -9.15514837 21.463324
## 315       6.198198  -4.3370736 16.733470  -9.91410952 22.310506
## 316       6.242308  -4.8299229 17.314540 -10.69120812 23.175825
## 317       6.286419  -5.3342612 17.907099 -11.48587759 24.058715
## 318       6.330529  -5.8497515 18.510810 -12.29760259 24.958661
## 319       6.374640  -6.3760861 19.125365 -13.12591256 25.875192
## 320       6.418750  -6.9129830 19.750483 -13.97037625 26.807876
## 321       6.462860  -7.4601830 20.385904 -14.83059697 27.756318
## 322       6.506971  -8.0174464 21.031388 -15.70620851 28.720150
## 323       6.551081  -8.5845516 21.686714 -16.59687169 29.699034
## 324       6.595192  -9.1612925 22.351676 -17.50227131 30.692655
## 325       6.639302  -9.7474768 23.026081 -18.42211357 31.700718
## 326       6.683412 -10.3429252 23.709750 -19.35612383 32.722949
## 327       6.727523 -10.9474691 24.402515 -20.30404461 33.759090
## 328       6.771633 -11.5609504 25.104217 -21.26563391 34.808900
## 329       6.815744 -12.1832199 25.814707 -22.24066374 35.872151
## 330       6.859854 -12.8141370 26.533845 -23.22891876 36.948627
## 331       6.903964 -13.4535683 27.261497 -24.23019516 38.038124
## 332       6.948075 -14.1013874 27.997537 -25.24429963 39.140449
## 333       6.992185 -14.7574742 28.741845 -26.27104848 40.255419
## 334       7.036296 -15.4217144 29.494306 -27.31026682 41.382858
## 335       7.080406 -16.0939989 30.254811 -28.36178785 42.522600
## 336       7.124516 -16.7742235 31.023256 -29.42545225 43.674485
## 337       7.168627 -17.4622886 31.799542 -30.50110759 44.838361
## 338       7.212737 -18.1580986 32.583573 -31.58860781 46.014082
## 339       7.256848 -18.8615619 33.375257 -32.68781279 47.201508
## 340       7.300958 -19.5725906 34.174507 -33.79858792 48.400504
## 341       7.345068 -20.2910999 34.981237 -34.92080373 49.610941
## 342       7.389179 -21.0170084 35.795366 -36.05433553 50.832693
## 343       7.433289 -21.7502374 36.616816 -37.19906313 52.065642
## 344       7.477400 -22.4907111 37.445510 -38.35487056 53.309670
## 345       7.521510 -23.2383562 38.281376 -39.52164577 54.564666
## 346       7.565620 -23.9931020 39.124343 -40.69928045 55.830521
## 347       7.609731 -24.7548798 39.974341 -41.88766978 57.107131
## 348       7.653841 -25.5236234 40.831306 -43.08671225 58.394395
## 349       7.697952 -26.2992683 41.695171 -44.29630944 59.692213
## 350       7.742062 -27.0817522 42.565876 -45.51636591 61.000490
## 351       7.786172 -27.8710145 43.443359 -46.74678901 62.319134
## 352       7.830283 -28.6669963 44.327562 -47.98748875 63.648054
## 353       7.874393 -29.4696404 45.218427 -49.23837762 64.987164
## 354       7.918504 -30.2788912 46.115898 -50.49937056 66.336378
## 355       7.962614 -31.0946945 47.019922 -51.77038475 67.695613
## 356       8.006724 -31.9169977 47.930446 -53.05133955 69.064788
## 357       8.050835 -32.7457493 48.847419 -54.34215641 70.443826
## 358       8.094945 -33.5808993 49.770790 -55.64275875 71.832649
## 359       8.139056 -34.4223989 50.700510 -56.95307187 73.231183
## 360       8.183166 -35.2702003 51.636532 -58.27302288 74.639355
## 361       8.227276 -36.1242571 52.578810 -59.60254066 76.057093
## 362       8.271387 -36.9845238 53.527297 -60.94155570 77.484329
## 363       8.315497 -37.8509561 54.481950 -62.29000013 78.920994
## 364       8.359608 -38.7235105 55.442726 -63.64780757 80.367023
## 365       8.403718 -39.6021446 56.409580 -65.01491313 81.822349
# alpha = 0.6 and beta = 0.3
des.2<- HoltWinters(train.ts, gamma = FALSE, beta = 0.3, alpha = 0.6)
plot(des.2)

# forecast
ramalandes2<- forecast(des.2, h=73)
ramalandes2
##     Point Forecast       Lo 80      Hi 80        Lo 95     Hi 95
## 293       5.647471    2.164524   9.130418    0.3207629  10.97418
## 294       5.689173    1.272006  10.106341   -1.0663017  12.44465
## 295       5.730876    0.190910  11.270843   -2.7417710  14.20352
## 296       5.772579   -1.043327  12.588485   -4.6514491  16.19661
## 297       5.814282   -2.407244  14.035808   -6.7594565  18.38802
## 298       5.855985   -3.885065  15.597035   -9.0416655  20.75364
## 299       5.897688   -5.465690  17.261066  -11.4810995  23.27647
## 300       5.939390   -7.140919  19.019700  -14.0652163  25.94400
## 301       5.981093   -8.904421  20.866607  -16.7843359  28.74652
## 302       6.022796  -10.751129  22.796721  -19.6307080  31.67630
## 303       6.064499  -12.676863  24.805861  -22.5979408  34.72694
## 304       6.106202  -14.678093  26.890496  -25.6806338  37.89304
## 305       6.147905  -16.751777  29.047586  -28.8741355  41.16994
## 306       6.189607  -18.895253  31.274468  -32.1743757  44.55359
## 307       6.231310  -21.106163  33.568783  -35.5777471  48.04037
## 308       6.273013  -23.382394  35.928420  -39.0810179  51.62704
## 309       6.314716  -25.722036  38.351467  -42.6812674  55.31070
## 310       6.356419  -28.123349  40.836187  -46.3758357  59.08867
## 311       6.398122  -30.584740  43.380983  -50.1622850  62.95853
## 312       6.439824  -33.104740  45.984389  -54.0383684  66.91802
## 313       6.481527  -35.681988  48.645042  -58.0020048  70.96506
## 314       6.523230  -38.315218  51.361678  -62.0512588  75.09772
## 315       6.564933  -41.003249  54.133115  -66.1843234  79.31419
## 316       6.606636  -43.744974  56.958245  -70.3995058  83.61278
## 317       6.648338  -46.539352  59.836029  -74.6952153  87.99189
## 318       6.690041  -49.385405  62.765488  -79.0699533  92.45004
## 319       6.731744  -52.282206  65.745694  -83.5223043  96.98579
## 320       6.773447  -55.228879  68.775773  -88.0509280 101.59782
## 321       6.815150  -58.224593  71.854893  -92.6545532 106.28485
## 322       6.856853  -61.268558  74.982263  -97.3319713 111.04568
## 323       6.898555  -64.360020  78.157131 -102.0820318 115.87914
## 324       6.940258  -67.498264  81.378780 -106.9036370 120.78415
## 325       6.981961  -70.682602  84.646524 -111.7957387 125.75966
## 326       7.023664  -73.912380  87.959708 -116.7573340 130.80466
## 327       7.065367  -77.186970  91.317703 -121.7874623 135.91820
## 328       7.107070  -80.505768  94.719907 -126.8852023 141.09934
## 329       7.148772  -83.868196  98.165741 -132.0496691 146.34721
## 330       7.190475  -87.273699 101.654650 -137.2800125 151.66096
## 331       7.232178  -90.721741 105.186097 -142.5754141 157.03977
## 332       7.273881  -94.211807 108.759569 -147.9350855 162.48285
## 333       7.315584  -97.743400 112.374567 -153.3582666 167.98943
## 334       7.357287 -101.316040 116.030613 -158.8442239 173.55880
## 335       7.398989 -104.929264 119.727242 -164.3922490 179.19023
## 336       7.440692 -108.582624 123.464008 -170.0016572 184.88304
## 337       7.482395 -112.275687 127.240477 -175.6717858 190.63658
## 338       7.524098 -116.008033 131.056229 -181.4019936 196.45019
## 339       7.565801 -119.779258 134.910859 -187.1916593 202.32326
## 340       7.607503 -123.588965 138.803972 -193.0401806 208.25519
## 341       7.649206 -127.436775 142.735187 -198.9469732 214.24539
## 342       7.690909 -131.322315 146.704133 -204.9114700 220.29329
## 343       7.732612 -135.245225 150.710449 -210.9331202 226.39834
## 344       7.774315 -139.205156 154.753786 -217.0113887 232.56002
## 345       7.816018 -143.201768 158.833803 -223.1457549 238.77779
## 346       7.857720 -147.234729 162.950170 -229.3357127 245.05115
## 347       7.899423 -151.303717 167.102564 -235.5807692 251.37962
## 348       7.941126 -155.408419 171.290671 -241.8804448 257.76270
## 349       7.982829 -159.548528 175.514186 -248.2342719 264.19993
## 350       8.024532 -163.723748 179.772811 -254.6417950 270.69086
## 351       8.066235 -167.933786 184.066256 -261.1025697 277.23504
## 352       8.107937 -172.178361 188.394236 -267.6161628 283.83204
## 353       8.149640 -176.457196 192.756476 -274.1821512 290.48143
## 354       8.191343 -180.770019 197.152705 -280.8001222 297.18281
## 355       8.233046 -185.116569 201.582661 -287.4696723 303.93576
## 356       8.274749 -189.496587 206.046084 -294.1904076 310.73991
## 357       8.316452 -193.909821 210.542724 -300.9619431 317.59485
## 358       8.358154 -198.356025 215.072334 -307.7839022 324.50021
## 359       8.399857 -202.834959 219.634674 -314.6559166 331.45563
## 360       8.441560 -207.346387 224.229507 -321.5776262 338.46075
## 361       8.483263 -211.890078 228.856604 -328.5486783 345.51520
## 362       8.524966 -216.465807 233.515738 -335.5687278 352.61866
## 363       8.566668 -221.073353 238.206690 -342.6374367 359.77077
## 364       8.608371 -225.712498 242.929241 -349.7544738 366.97122
## 365       8.650074 -230.383032 247.683180 -356.9195148 374.21966

Next, the fit on the training data and the forecast on the test data can be compared visually as follows.

#Visually evaluate the prediction
plot(data1.ts)
lines(des.1$fitted[,1], lty=2, col="blue")
lines(ramalandes1$mean, col="red")
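
A legend makes the three series easier to tell apart. A minimal sketch, assuming data1.ts, des.1, and ramalandes1 exist as created above:

#Replot with a legend distinguishing observed, fitted, and forecast series
plot(data1.ts, ylab = "WS50M")
lines(des.1$fitted[,1], lty = 2, col = "blue")
lines(ramalandes1$mean, col = "red")
legend("topleft", legend = c("Actual", "Fitted", "Forecast"),
       col = c("black", "blue", "red"), lty = c(1, 2, 1), cex = 0.8)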

To obtain the optimum DES parameter values, the alpha and beta arguments can be left as NULL (the default), as follows.

#Optimum alpha and beta
des.opt<- HoltWinters(train.ts, gamma = FALSE)
des.opt
## Holt-Winters exponential smoothing with trend and without seasonal component.
## 
## Call:
## HoltWinters(x = train.ts, gamma = FALSE)
## 
## Smoothing parameters:
##  alpha: 0.2446006
##  beta : 0.1028716
##  gamma: FALSE
## 
## Coefficients:
##         [,1]
## a 5.32758702
## b 0.02106041
plot(des.opt)
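
When alpha and beta are not supplied, HoltWinters() estimates them by minimizing the squared one-step-ahead prediction errors on the training data; the minimized value is stored in the SSE component. A quick check, assuming des.opt from above:

#The optimized alpha and beta minimize the in-sample one-step SSE
des.opt$SSE
c(alpha = des.opt$alpha, beta = des.opt$beta)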

#forecast
ramalandesopt<- forecast(des.opt, h=73)
ramalandesopt
##     Point Forecast        Lo 80     Hi 80        Lo 95     Hi 95
## 293       5.348647   2.33460069  8.362694   0.73906072  9.958234
## 294       5.369708   2.24791762  8.491498   0.59534170 10.144074
## 295       5.390768   2.14488602  8.636650   0.42661976 10.354917
## 296       5.411829   2.02560462  8.798053   0.23304590 10.590611
## 297       5.432889   1.89038104  8.975397   0.01509057 10.850688
## 298       5.453949   1.73967904  9.168220  -0.22653695 11.134436
## 299       5.475010   1.57406872  9.375951  -0.49096479 11.440985
## 300       5.496070   1.39418314  9.597957  -0.77722475 11.769365
## 301       5.517131   1.20068341  9.833578  -1.08430576 12.118567
## 302       5.538191   0.99423219 10.082150  -1.41119435 12.487577
## 303       5.559251   0.77547496 10.343028  -1.75690336 12.875406
## 304       5.580312   0.54502770 10.615596  -2.12049075 13.281115
## 305       5.601372   0.30346956 10.899275  -2.50107074 13.703815
## 306       5.622433   0.05133937 11.193526  -2.89781929 14.142685
## 307       5.643493  -0.21086536 11.497852  -3.30997553 14.596962
## 308       5.664554  -0.48268761 11.811795  -3.73684050 15.065948
## 309       5.685614  -0.76370898 12.134937  -4.17777430 15.549002
## 310       5.706674  -1.05354711 12.466896  -4.63219218 16.045541
## 311       5.727735  -1.35185291 12.807322  -5.09956023 16.555030
## 312       5.748795  -1.65830763 13.155898  -5.57939099 17.076981
## 313       5.769856  -1.97262004 13.512331  -6.07123904 17.610950
## 314       5.790916  -2.29452375 13.876356  -6.57469699 18.156529
## 315       5.811976  -2.62377471 14.247727  -7.08939158 18.713344
## 316       5.833037  -2.96014892 14.626222  -7.61498025 19.281054
## 317       5.854097  -3.30344038 15.011635  -8.15114794 19.859342
## 318       5.875158  -3.65345924 15.403774  -8.69760429 20.447919
## 319       5.896218  -4.01003011 15.802466  -9.25408109 21.046517
## 320       5.917278  -4.37299062 16.207547  -9.82032999 21.654887
## 321       5.938339  -4.74219008 16.618868 -10.39612054 22.272798
## 322       5.959399  -5.11748832 17.036287 -10.98123838 22.900037
## 323       5.980460  -5.49875467 17.459674 -11.57548365 23.536403
## 324       6.001520  -5.88586702 17.888907 -12.17866961 24.181710
## 325       6.022580  -6.27871101 18.323872 -12.79062135 24.835782
## 326       6.043641  -6.67717933 18.764461 -13.41117476 25.498456
## 327       6.064701  -7.08117103 19.210574 -14.04017545 26.169578
## 328       6.085762  -7.49059100 19.662114 -14.67747797 26.849001
## 329       6.106822  -7.90534943 20.118994 -15.32294496 27.536589
## 330       6.127882  -8.32536136 20.581126 -15.97644647 28.232211
## 331       6.148943  -8.75054626 21.048432 -16.63785937 28.935745
## 332       6.170003  -9.18082770 21.520834 -17.30706676 29.647073
## 333       6.191064  -9.61613299 21.998260 -17.98395745 30.366085
## 334       6.212124 -10.05639290 22.480641 -18.66842558 31.092674
## 335       6.233184 -10.50154138 22.967910 -19.36037014 31.826739
## 336       6.254245 -10.95151535 23.460005 -20.05969464 32.568184
## 337       6.275305 -11.40625446 23.956865 -20.76630680 33.316917
## 338       6.296366 -11.86570089 24.458432 -21.48011819 34.072850
## 339       6.317426 -12.32979921 24.964651 -22.20104403 34.835896
## 340       6.338487 -12.79849618 25.475469 -22.92900289 35.605976
## 341       6.359547 -13.27174060 25.990834 -23.66391648 36.383010
## 342       6.380607 -13.74948322 26.510698 -24.40570947 37.166924
## 343       6.401668 -14.23167657 27.035012 -25.15430927 37.957645
## 344       6.422728 -14.71827489 27.563731 -25.90964588 38.755102
## 345       6.443789 -15.20923397 28.096811 -26.67165170 39.559229
## 346       6.464849 -15.70451112 28.634209 -27.44026145 40.369959
## 347       6.485909 -16.20406505 29.175884 -28.21541196 41.187231
## 348       6.506970 -16.70785580 29.721795 -28.99704212 42.010982
## 349       6.528030 -17.21584464 30.271905 -29.78509272 42.841153
## 350       6.549091 -17.72799405 30.826175 -30.57950636 43.677688
## 351       6.570151 -18.24426761 31.384570 -31.38022734 44.520529
## 352       6.591211 -18.76462997 31.947053 -32.18720160 45.369624
## 353       6.612272 -19.28904680 32.513590 -33.00037663 46.224920
## 354       6.633332 -19.81748470 33.084149 -33.81970138 47.086366
## 355       6.654393 -20.34991122 33.658696 -34.64512616 47.953911
## 356       6.675453 -20.88629473 34.237201 -35.47660266 48.827509
## 357       6.696513 -21.42660446 34.819631 -36.31408379 49.707111
## 358       6.717574 -21.97081043 35.405958 -37.15752370 50.592671
## 359       6.738634 -22.51888339 35.996152 -38.00687767 51.484146
## 360       6.759695 -23.07079483 36.590184 -38.86210208 52.381491
## 361       6.780755 -23.62651692 37.188027 -39.72315438 53.284665
## 362       6.801815 -24.18602248 37.789653 -40.58999301 54.193624
## 363       6.822876 -24.74928499 38.395037 -41.46257739 55.108329
## 364       6.843936 -25.31627851 39.004151 -42.34086785 56.028740
## 365       6.864997 -25.88697769 39.616971 -43.22482563 56.954819

Next, accuracy will be computed on both the training and the test data using the SSE, MSE, and MAPE accuracy measures.
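
The three measures can be wrapped in a small helper. A sketch (the function name akurasi is ours, not from the material; it follows the material's convention of dividing by the full series length even though the first residuals are NA):

#SSE, MSE, and MAPE from residuals and the actual series
akurasi <- function(sisaan, aktual) {
  ok  <- !is.na(sisaan)
  sse <- sum(sisaan[ok]^2)
  c(SSE  = sse,
    MSE  = sse / length(aktual),
    MAPE = sum(abs(sisaan[ok] / aktual[ok]) * 100) / length(aktual))
}
#e.g. akurasi(ramalandes1$residuals, train.ts)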

Training Data Accuracy

#Training data accuracy
ssedes.train1<-des.1$SSE
msedes.train1<-ssedes.train1/length(train.ts)
sisaandes1<-ramalandes1$residuals
head(sisaandes1)
## Time Series:
## Start = 1 
## End = 6 
## Frequency = 1 
## [1]        NA        NA -0.550000  0.472000  1.980720  2.878467
mapedes.train1 <- sum(abs(sisaandes1[3:length(train.ts)]/train.ts[3:length(train.ts)])
                      *100)/length(train.ts)

akurasides.1 <- matrix(c(ssedes.train1,msedes.train1,mapedes.train1))
row.names(akurasides.1)<- c("SSE", "MSE", "MAPE")
colnames(akurasides.1) <- c("Akurasi lamda=0.2 dan gamma=0.2")
akurasides.1
##      Akurasi lamda=0.2 dan gamma=0.2
## SSE                      1629.860570
## MSE                         5.581714
## MAPE                       34.873688
ssedes.train2<-des.2$SSE
msedes.train2<-ssedes.train2/length(train.ts)
sisaandes2<-ramalandes2$residuals
head(sisaandes2)
## Time Series:
## Start = 1 
## End = 6 
## Frequency = 1 
## [1]       NA       NA -0.55000  0.76900  1.86818  1.74158
mapedes.train2 <- sum(abs(sisaandes2[3:length(train.ts)]/train.ts[3:length(train.ts)])
                      *100)/length(train.ts)

akurasides.2 <- matrix(c(ssedes.train2,msedes.train2,mapedes.train2))
row.names(akurasides.2)<- c("SSE", "MSE", "MAPE")
colnames(akurasides.2) <- c("Akurasi lamda=0.6 dan gamma=0.3")
akurasides.2
##      Akurasi lamda=0.6 dan gamma=0.3
## SSE                      2134.727508
## MSE                         7.310711
## MAPE                       39.884722

From the training-data accuracy, scenario 2 with lamda=0.6 and gamma=0.3 gives the worse result. Based on their MAPE values (roughly 35% and 40%), both scenarios are categorized as less-than-good forecasts.

Test Data Accuracy

#Test data accuracy
selisihdes1<-ramalandes1$mean-testing$WS50M
selisihdes1
## Time Series:
## Start = 293 
## End = 365 
## Frequency = 1 
##  [1] -2.10223065  0.34187974 -0.99400986 -2.66989946  2.06421093 -0.30167867
##  [7] -0.98756827  2.96654213 -0.64934748 -1.71523708  2.34887332  3.16298371
## [13]  2.47709411  0.97120451  1.36531491  2.20942530  1.66353570  1.67764610
## [19]  0.68175649  0.94586689  0.37997729 -1.35591232 -1.32180192 -0.38769152
## [25]  2.63641888  0.92052927  2.36463967  1.36875007  2.54286046  3.92697086
## [31]  3.46108126  1.36519166  4.23930205  0.13341245  0.81752285 -0.62836676
## [37]  0.36574364  2.85985404 -1.06603557  0.02807483  2.90218523  0.38629563
## [43] -3.77959398  2.06451642  1.02862682  1.01273721  1.91684761  2.33095801
## [49] -0.35493159 -4.01082120  0.06328920  4.30739960  4.77150999  1.71562039
## [55] -0.14026921  5.58384118  3.76795158  3.63206198  0.32617238  0.19028277
## [61]  3.92439317  4.59850357  3.80261396  0.99672436  1.96083476  4.50494516
## [67]  2.36905555  1.36316595  3.68727635  2.66138674  3.76549714  1.18960754
## [73]  2.89371793
SSEtestingdes1<-sum(selisihdes1^2)
MSEtestingdes1<-SSEtestingdes1/length(testing$WS50M)
MAPEtestingdes1<-sum(abs(selisihdes1/testing$WS50M)*100)/length(testing$WS50M)

selisihdes2<-ramalandes2$mean-testing$WS50M
selisihdes2
## Time Series:
## Start = 293 
## End = 365 
## Frequency = 1 
##  [1] -1.68252937  0.75917346 -0.57912371 -2.25742089  2.47428194  0.10598477
##  [7] -0.58231241  3.36939042 -0.24890676 -1.31720393  2.74449890  3.55620172
## [13]  2.86790455  1.35960738  1.75131020  2.59301303  2.04471586  2.05641868
## [19]  1.05812151  1.31982434  0.75152716 -0.98677001 -0.95506718 -0.02336436
## [25]  2.99833847  1.28004130  2.72174412  1.72344695  2.89514978  4.27685260
## [31]  3.80855543  1.71025825  4.58196108  0.47366391  1.15536673 -0.29293044
## [37]  0.69877239  3.19047521 -0.73782196  0.35388087  3.22558369  0.70728652
## [43] -3.46101065  2.38069217  1.34239500  1.32409783  2.22580065  2.63750348
## [49] -0.05079369 -3.70909087  0.36261196  4.60431479  5.06601761  2.00772044
## [55]  0.14942327  5.87112609  4.05282892  3.91453174  0.60623457  0.46793740
## [61]  4.19964022  4.87134305  4.07304588  1.26474870  2.22645153  4.76815436
## [67]  2.62985718  1.62156001  3.94326284  2.91496566  4.01666849  1.43837132
## [73]  3.14007414
SSEtestingdes2<-sum(selisihdes2^2)
MSEtestingdes2<-SSEtestingdes2/length(testing$WS50M)
MAPEtestingdes2<-sum(abs(selisihdes2/testing$WS50M)*100)/length(testing$WS50M)

selisihdesopt<-ramalandesopt$mean-testing$WS50M
selisihdesopt
## Time Series:
## Start = 293 
## End = 365 
## Frequency = 1 
##  [1] -1.98135257  0.43970784 -0.91923176 -2.61817135  2.09288905 -0.29605054
##  [7] -1.00499013  2.92607027 -0.71286932 -1.80180891  2.23925149  3.03031190
## [13]  2.32137231  0.79243271  1.16349312  1.98455353  1.41561393  1.40667434
## [19]  0.38773474  0.62879515  0.03985556 -1.71908404 -1.70802363 -0.79696322
## [25]  2.20409718  0.46515759  1.88621800  0.86727840  2.01833881  3.37939922
## [31]  2.89045962  0.77152003  3.62258043 -0.50635916  0.15470125 -1.31423835
## [37] -0.34317794  2.12788247 -1.82105713 -0.74999672  2.10106369 -0.43787591
## [43] -4.62681550  1.19424491  0.13530531  0.09636572  0.97742612  1.36848653
## [49] -1.34045306 -5.01939266 -0.96833225  3.25272816  3.69378856  0.61484897
## [55] -1.26409062  4.43696978  2.59803019  2.43909060 -0.88984900 -1.04878859
## [61]  2.66227181  3.31333222  2.49439263 -0.33454697  0.60651344  3.12757385
## [67]  0.96863425 -0.06030534  2.24075507  1.19181547  2.27287588 -0.32606371
## [73]  1.35499669
SSEtestingdesopt<-sum(selisihdesopt^2)
MSEtestingdesopt<-SSEtestingdesopt/length(testing$WS50M)
MAPEtestingdesopt<-sum(abs(selisihdesopt/testing$WS50M)*100)/length(testing$WS50M)

akurasitestingdes <-
  matrix(c(SSEtestingdes1,MSEtestingdes1,MAPEtestingdes1,SSEtestingdes2,MSEtestingdes2,
           MAPEtestingdes2,SSEtestingdesopt,MSEtestingdesopt,MAPEtestingdesopt),
         nrow=3,ncol=3)
row.names(akurasitestingdes)<- c("SSE", "MSE", "MAPE")
colnames(akurasitestingdes) <- c("des ske1","des ske2","des opt")
akurasitestingdes
##        des ske1   des ske2    des opt
## SSE  435.587728 505.847388 291.383469
## MSE    5.966955   6.929416   3.991554
## MAPE  49.724307  54.655150  39.486424

Comparison of SES and DES

MSEfull <-
  matrix(c(MSEtesting1,MSEtesting2,MSEtestingopt,MSEtestingdes1,MSEtestingdes2,
           MSEtestingdesopt),nrow=3,ncol=2)
row.names(MSEfull)<- c("ske 1", "ske 2", "ske opt")
colnames(MSEfull) <- c("ses","des")
MSEfull
##              ses      des
## ske 1   127.3755 5.966955
## ske 2   129.5004 6.929416
## ske opt 129.0958 3.991554

The two methods can be compared using the same accuracy measure. The example above compares them using MSE. The result shows that DES performs better than SES, as seen from its much smaller MSE values on the test data.
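
The same side-by-side comparison can be made with MAPE. A sketch, assuming the SES test MAPEs (MAPEtesting1, MAPEtesting2, MAPEtestingopt) were computed in the SES section, analogous to the MSE objects used above:

#MAPE comparison of SES and DES on the test data
MAPEfull <- matrix(c(MAPEtesting1, MAPEtesting2, MAPEtestingopt,
                     MAPEtestingdes1, MAPEtestingdes2, MAPEtestingdesopt),
                   nrow = 3, ncol = 2)
row.names(MAPEfull) <- c("ske 1", "ske 2", "ske opt")
colnames(MAPEfull) <- c("ses", "des")
MAPEfull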